peer review versus supervisory review:

here's one possible suggestion: a "Member Evaluation" at the end of the term that allows each member to grade the other members on their performance.

possible ways to overcome the problem of distinguishing popularity from technical capability?

science, scientific literature, competition...


lab8: microprocessors for new modes of communication

the purpose of this lecture is to introduce lab8, the last lab before the final project lab.

you must come to lab prepared with a thorough understanding of one of the quickcam programs available for gnu linux; you will be expected to know how to use the quickcam right away at the beginning of the lab. please also study the quickcam itself (e.g. horizontal field of view, vertical field of view, computer programs available on the internet to support it, image sizes it can produce, etc.) before coming to the lab.


the nature of microprocessors is rapidly changing.

microprocessors were originally designed for calculations (A). the world's first microprocessor, the intel 4004, was designed for busicom, a manufacturer of electronic calculators.

in today's era, however, microprocessors are used primarily for communication (B). for example, you're more likely to use a microprocessor for preparing a document, placing a phone call, using multimedia services, watching television, or making a purchase. these are all forms of communication.


multimedia communication: more than just text, graphics, and sound.

personal imaging is a growing field of research that involves the use of image capture devices.

the simplest image capture device is probably the connectix QuickCam (TM), which is what we'll be learning about in the labs.

there is lots of info on the quickcam; just do a www search or the like.

there's even a www page on how to disassemble a quickcam.


telepoint

one new and emerging form of communication is called telepresence.

instead of just talking like you do on the telephone, telepresence involves a sense of shared reality, or "being there".

videoconferencing tried to provide telepresence, but failed.
(nearly every commercial videoconferencing offering failed to obtain widespread acceptance, yet standard voice telephones are ubiquitous).

just seeing somebody doesn't provide a good sense of collaborative capability.

the goal of this lab is to illustrate the use of microprocessors in modern communications, in particular, telepresence.

you will implement a remote pointing device called "telepoint" (also known as a "telepointer").

basically, you shine a laser pointer at a screen in one location, and the dot "comes out" in another location that could be hundreds of miles away.

telepoint can be used to replace the mouse and keyboard of a traditional computer, and is much more intuitive to use than a computer mouse.


first, i introduce a new device called an "aremac" (for those interested in the etymology of this new word, try spelling camera backwards).

the aremac is something you can make from 2 servos and a laser pointer, and it scans out the scene kind of like the electron beam of a cathode ray tube does, except that it can point at 3d scenes instead of the flat screen of a television.

an aremac is kind of like a tv set that displays onto 3d objects instead of a flat screen.


FIG. 1 illustrating the aremac in relation to other known devices.



FIG. 1 is a tabular figure defining the aremac in relation to known devices, the known devices being the scanner, the projector, and the camera.

There are various kinds of scanners. Some scanners work in a manner similar to photocopiers while others comprise a sensor array mounted in a box on a copy stand where a flat object can be placed. For purposes of explanation, consider the copy--stand embodiment of the scanner. The copy--stand embodiment of the scanner, depicted in the figure, is commonly used to record the image from a flat object such as the page of a book, 110, by way of light 112 bouncing off the flat object, and entering a lens 114, into the scanner body 116. The scanner receives and records light from a two dimensional (2D) object.

The projector transmits and displays light onto a 2D object. A projector 120 is typically fitted with a lens 122, which directs light 124 onto a projection screen, or flat wall (usually light in color) 126.

The camera receives and records light from one or more three dimensional (3D) objects. Objects 130 scatter ambient light from the environment, or light from artificial sources, 132, and lens 134 attached to camera 136 forms an image of the objects 130 inside the camera 136, where the image is recorded or transmitted to a remote location for storage or remote observation. A camera may take pictures of 2D objects like the scanner does, but it is important to realize that the camera has sufficient depth of field to capture pictures of 3D objects.

The aremac 140 typically comprises optics 142 which direct light 144 at a 3D scene 146. In this way the aremac is to the camera as the projector is to the scanner. Similarly, the aremac may project light onto 2D or 3D scenes, but it is important to realize that the aremac has sufficient depth of field to project onto 3D objects and scenes.
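
In summary, the tabular figure defines the four devices along two axes:

                        receives light       transmits light
    2D (flat) object    scanner 116          projector 120
    3D scene            camera 136           aremac 140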

FIG. 2 illustrating the use of the telepointer for telepresence collaboration

bold parts indicate the parts you'll put together in the lab, and these include the QuickCam (denoted QC), the wire from QC to the "WearComp", and the aremac 140 which will be formed from your /dev/pwm0 and /dev/pwm1.

FIG. 2 depicts the aremac 140 as part of a system which facilitates visual communication and collaboration.

This collaboration takes place between a person standing in the vicinity of aremac 140, and another person, perhaps thousands of miles away, standing in front of a video projector 120.

Objects 210 (e.g. any objects within the field of illumination of aremac 140) scatter light from aremac 140, so that the output of aremac 140 is visible to a person in the vicinity of objects 210.

Objects 210 are also visible at a remote site, by way of a portion of scene light deflected by beamsplitter 220 to camera 136, where an image is recorded and transmitted, typically by a radio transmitter 230, into transmitting antenna 232.

A person, hereafter referred to as ``the photographer'' (without loss of generality, e.g. whether or not the task said person is engaged in is photography), in or near the scene where objects 210 are located, interacts with a remote manager while viewing objects 210.

The signal from camera 136 is sent by way of a radio transmitter, by telephone lines, computer network, or the like, to a remote, possibly distant location, where it is routed to projector 120. Emanating from projector 120 there are rays of light 252 which reach beamsplitter 254 and are partially reflected as rays 256 which are considered wasted light. However, some of the light from projector 120 will pass through beamsplitter 254 and emerge as light rays 258. The projected image thus appears upon screen 260.

A second person, hereafter referred to as the photographer's manager or assistant, without intended loss of generality (e.g. regardless of whether the task to which assistance or guidance is being offered is the task of photography or some other task), can observe the scene 210 on screen 260, and can point to objects in the scene 210, by simply pointing to various parts of the screen 260. Camera 237 can also observe the screen 260, by way of beamsplitter 254, and this image of the photographer's manager or assistant pointing at objects in the scene is transmitted back to aremac 140.

In order to prevent video feedback, there is a polarizer 280 in front of camera 237, oriented to pass light from the manager. Insofar as beamsplitter 254 may or may not fall at exactly Brewster's angle (the angle of maximum polarization), a second polarizer 282 is provided in front of screen 260, whereby polarizers 280, 282, along with the angle of beamsplitter 254 (and correspondingly, keeping camera 237 properly oriented), are adjusted to minimize video feedback and maximize the quality of the image from the manager.

The light, 290, emanating from aremac 140, hits beamsplitter 220, and some is lost as waste light 292. The rest of the light, 294, that passes through beamsplitter 220, illuminates the scene 210. Thus photographer 240 sees the image of the manager cast upon objects in the scene 210. Although this image of the manager will appear disjoint in the photographer's direct view of objects 210, the photographer's view of objects 210 as seen by camera 136, projected into display 244, will appear as a coherent view of the manager and gestures such as pointing at particular objects in scene 210. This coherence and continuity of images as seen in display 244 is due to the same principle by which a spotlight operator always sees the circular shape of the spotlight even when projecting onto oblique or disjoint surfaces.

The shared view facilitates collaboration, which is especially effective when combined with a voice communications capability as might be afforded by the use of a wearable hands--free cellular telephone used together with the visual collaboration apparatus. Alternatively, the photographer's portion of the voice communications capability can be built into the head mounted display 244, and share a common data communications link, for example, having voice, video, and data communications routed through a body worn computer system attached to photographer 240 and linked to the system of the manager by way of a wireless data communications network.
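
(For reference on the Brewster's angle mentioned above: Brewster's angle satisfies tan(theta_B) = n2/n1, so for a typical glass beamsplitter in air, where n2/n1 is about 1.5, theta_B is roughly 56 degrees. The exact value is not critical here, since the polarizers and beamsplitter angle are adjusted empirically to minimize feedback.)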


FIG. 3 illustrating how telepointing works to control an aremac with a laser pointer.



FIG. 3 depicts a manager's office 300 remotely connected to a photographer's studio 301. This connection may be by wire, telephone, radio, satellite communications, fiber optics, or the like. Objects as part of scene 210 in the photographer's studio are seen as objects 310 on a large projection screen 315 at the front of the manager's office. The manager is sitting at a desk, watching the large projection screen 315, and pointing at this large projection screen 315 using a laser pointer. She notices that one of the objects in the scene is slightly out of focus, and not well illuminated, so she points her laser pointer at this object upon screen 315. The laser pointer makes a bright red dot 320 on the screen. A camera 330 in the manager's office points at the screen 315 in such a way that the field of view of camera 330 matches that of the photographer's camera. Since the image from the photographer's camera is displayed on screen 315, camera 330 can easily be made to match this field of view by building camera 330 into the projector that displays on screen 315.

The video signal output of screen camera 330 is connected to a vision processor (e.g. a 486 based "wearcomp") 340 which simply determines the coordinates of the brightest point in the image seen by camera 330 if there is a dominant brightest point. Camera 330 does not need to be a high quality camera since it will only be used to see where the laser pointer is pointing. A cheap black and white QuickCam (TM) will suffice for this purpose.

Selection of the brightest pixel will tell us the coordinates, but a better estimate can be made by using vision processor 340 to determine the coordinates of a bright red blob 320 to sub--pixel accuracy. This would help reduce the resolution needed, so that smaller images could be used (which load more quickly over the QuickCam's parallel port).
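
One way to make such a sub--pixel estimate is an intensity--weighted centroid over the pixels above a threshold. The following is a sketch only (the function name and threshold handling are assumptions, not part of any existing qcam program):

    /* Sub-pixel estimate of the laser dot position: an intensity-weighted
     * centroid over pixels brighter than thresh.  scan is a width x height
     * grayscale buffer such as the quickcam produces.  Returns -1 if no
     * pixel exceeds the threshold (no dominant bright point). */
    int centroid(const unsigned char *scan, int width, int height,
                 int thresh, double *cx, double *cy)
    {
        long sum = 0;
        double sx = 0.0, sy = 0.0;
        int x, y;

        for (y = 0; y < height; y++)
            for (x = 0; x < width; x++) {
                int v = scan[y * width + x];
                if (v > thresh) {   /* only the bright blob contributes */
                    sum += v;
                    sx += (double)v * x;
                    sy += (double)v * y;
                }
            }
        if (sum == 0)
            return -1;
        *cx = sx / sum;             /* fractional pixel coordinates */
        *cy = sy / sum;
        return 0;
    }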

These coordinates as signals 350 and 351 are received at the photographer's studio 301 and are fed to a galvo drive mechanism (servo) which controls two galvos (e.g. Futaba (TM) hobby servos, or the like). Coordinate signal 350 drives azimuthal galvo 380 while coordinate signal 351 drives elevational galvo 381. These galvos are calibrated by the galvo drive unit 360 so that aremac laser 370 is directed to form a red dot 321 on the object in the photographer's studio 301 that the manager is pointing at from her office 300. Aremac laser 370, together with galvo drive 360 and galvos 380 and 381, comprises a device called an aremac, which may be built into the photographer's camera so that the two will be properly calibrated. This aremac may alternatively be housed on the same mounting tripod as the photographer's camera, where the two may be combined by way of a beamsplitter. If it is not practical or desirable to use a beamsplitter, or it is not practical to calibrate the entire apparatus, the manager may use an infrared laser pointer so that she cannot see the dot formed by her own laser pointer. In this case, she will look at the image of the red dot that is captured by the photographer's camera, so that what is seen by her as dot 320 on screen 315 is by way of her ability to look through the photographer's camera. Note that in all cases, the laser beam in the photographer's studio will be in the visible portion of the spectrum (e.g. red and not infrared). In this way, her very act of pointing will cause her own mind and body to close the feedback loop around any reasonable degree of misalignment or parallax error in the entire system.


FIG. 4 showing how one side of the communications system works



FIG. 4 depicts a portable hand--held or wearable embodiment of the invention which does not need to be worn upon the head, where it would cover an eye of the user. Camera 410, which views the scene through beamsplitter 420, sends video to a motion stabilization system 430. The stabilized video signal from stabilization system 430 is sent to a remote director by inbound transmitter 440. At a remote location, the remote director displays video received from transmitter 440 on a large screen video projector. The remote director points to objects in the scene by pointing at the screen with a laser pointer. A scanner in the director's office scans the screen to determine where the director is pointing, and these coordinates are sent back to be received by outbound receiver 450. These coordinates are converted back into the coordinate system of camera 410. This conversion is done by motion destabilizer 460, which performs the inverse operation of what the motion stabilizer 430 does, possibly with a time lag (e.g. undoes what the motion stabilizer recently did). The coordinates, in destabilized form (e.g. in the coordinates of camera 410), direct aremac 470 to point at the corresponding object in the scene. Thus when the remote director points at an object on her screen using her laser pointer, the photographer sees a red dot appear upon that same object at the corresponding location. Thus, for example, if a remote spouse is remotely watching what her husband is pointing the apparatus at, she can see the video on her screen, and point at an object in view of the camera, causing aremac 470 to point at this object. This functionality (teleoperation of a laser pointer with a laser pointer as an input device) is called telepointing, and the apparatus shown in Fig. 4 is an example of a telepointing means. Typically, the apparatus of Fig. 4 will be housed inside a cellular telephone which becomes the communications channel 440 and 450. This facilitates voice communication, and allows the photographer to point the camera at objects in the scene, where, for example, a remote spouse can telepoint to objects such as one of the levers on the steering column of a new car that her husband is shopping for.
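
The following sketch shows the destabilizer idea in its simplest form, assuming the stabilizer reduces to a per--frame translation (dx, dy); a real system might use a more general coordinate transformation, and all names, the lag value, and the ring--buffer scheme here are hypothetical:

    #define NFRAMES 256   /* ring buffer size */
    #define LAG 3         /* assumed round-trip delay, in frames */

    struct shift { int dx, dy; };
    static struct shift history[NFRAMES]; /* shifts applied by the stabilizer */
    static unsigned long frame = 0;

    /* called by the stabilizer (430) once per frame, with the shift
     * it applied to stabilize that frame */
    void stabilizer_record(int dx, int dy)
    {
        history[frame % NFRAMES].dx = dx;
        history[frame % NFRAMES].dy = dy;
        frame++;
    }

    /* destabilizer (460): map coordinates received from the director's
     * screen back into the coordinates of camera 410, LAG frames ago
     * (assumes at least LAG+1 frames have been recorded) */
    void destabilize(int xs, int ys, int *xc, int *yc)
    {
        struct shift s = history[(frame + NFRAMES - 1 - LAG) % NFRAMES];
        *xc = xs - s.dx;  /* undo what the stabilizer did to that frame */
        *yc = ys - s.dy;
    }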

grading

  1. get the quickcam working. it would be a good idea to familiarize yourself with the quickcam tonight so that you're aware of the gnux (gnulinux) utilities that are out there for the standard black and white quickcams we'll be using.

    this mark is awarded for showing us that you can download (from the internet, or the like), install, and use any suitable quickcam program to take a picture with the quickcam.

  2. write a short program to adjust the parameters of the quickcam interactively. these are ordinarily set in a file, the location of which depends on which quickcam program you download from the internet. it might look something like /etc/qcam.conf, for example.
  3. write a short program to determine the pixel coordinates of the brightest point in an image (a sketch of one approach appears after this list).

    your program should accept as input an image from the quickcam, and output two numbers corresponding to the coordinates of the brightest pixel. your output numbers should both be scaled from 0 to 255, so that regardless of the size of the image you select, the numbers you get can be used by one of your previously written device drivers that take an unsigned character input. (image size and brightness, etc., are adjusted by editing /etc/qcam.conf or the like).

    you should be able to adjust the brightness of the quickcam image so that it is all black, or nearly so, until a laser pointer is used to shine a point in its field of view, in which case it should easily detect this bright spot. grading will be done by shining a laser pointer at a point in the camera's field of view and seeing if your program reports the coordinates of the bright red dot made by the laser pointer.

    you're welcome to do something creative like put a red filter over the camera to make it more red sensitive, but this will likely not be necessary.

  4. make /dev/pwm0 and /dev/pwm1 responsive to the coordinates of the brightest pixel in the image, as above.

    in this way, two servos, one mounted horizontally, the other vertically, with small mirrors attached to each, will serve to deflect a laser beam in horizontal and vertical directions.

    you will need to change the duty cycle range of /dev/pwm1 to match that of /dev/pwm0 (e.g. change it from the range 0 to 1 to the range .1 to .2, since it will be driving a second servo instead of an LED).
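
here is a minimal sketch covering items 3 and 4, assuming the binary (P5) PGM output described in the pre-lab questionnaire below, and the /dev/pwm0 and /dev/pwm1 drivers from the earlier labs (each taking one unsigned character). it does not handle PGM comment lines, and you should check the device names and scaling against your own setup:

    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char *argv[])
    {
        FILE *f, *pwm0, *pwm1;
        unsigned char *scan, xout, yout;
        int width, height, maxval, npix, best, i;

        f = (argc > 1) ? fopen(argv[1], "rb") : stdin;
        if (!f || fscanf(f, "P5 %d %d %d", &width, &height, &maxval) != 3) {
            fprintf(stderr, "expected a binary (P5) PGM image\n");
            return 1;
        }
        fgetc(f);                 /* eat the single whitespace after maxval */

        npix = width * height;
        scan = malloc(npix);
        if (!scan || fread(scan, 1, npix, f) != (size_t)npix) {
            fprintf(stderr, "short read\n");
            return 1;
        }

        /* find the brightest pixel (item 3) */
        best = 0;
        for (i = 1; i < npix; i++)
            if (scan[i] > scan[best])
                best = i;

        /* scale both coordinates to 0..255 regardless of image size */
        xout = (best % width) * 255 / (width - 1);
        yout = (best / width) * 255 / (height - 1);
        printf("%d %d\n", xout, yout);

        /* drive the two servos (item 4): pwm0 = horizontal, pwm1 = vertical */
        pwm0 = fopen("/dev/pwm0", "wb");
        pwm1 = fopen("/dev/pwm1", "wb");
        if (pwm0 && pwm1) {
            fputc(xout, pwm0);
            fputc(yout, pwm1);
            fclose(pwm0);
            fclose(pwm1);
        }
        free(scan);
        return 0;
    }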


lab6 solutions

this lab needs the results of lab6, so for those who had trouble with lab6, the solutions to lab6 (e.g. the original programs that haven't had lines deleted from them) are available here.

lab8 programs

the programs for lab8 are located here

final project lab

you might want to extend this lab (lab8) as your final project (lab9). for example, you might want to make a real telepoint system out of this lab, or extend it in various other ways.
A suggestion to create incentive for students to do the pre-lab:
================================================================
Create a take-home pre-lab questionnaire due at the beginning of the lab.
This 5-question questionnaire is handed out to each student and can only
be answered by doing the pre-lab reading and preparation.

An Individual Pre-lab Questionnaire:
====================================
0) What is the URL where you can find the qcam files that we are using for
lab8?
# http://www.eyetap.org/ece385/lab8/programs/

1) What is the default screen size in pixels that the qcam outputs?
# found in qcam.conf
# default width 160 and height 120

2) What is the name of the file format that qcam outputs, and how is this
indicated in the file?
# qcam outputs a PGM file
# indicated by P5

3) What are the two formats found in the output file of qcam?
# first three lines are ascii
# remainder of file is in binary (see the example header after this questionnaire)

4) What is the name of the user-defined function that writes the scan buffer
out to a file, and what is the C function call it uses?
# function qc_writepgm()
# calls: fputc(scan[i],f);

5) What is the name of the variable in the qcam structure that holds the
value of the width of the output image, and how do you access it?
# from file qcam.h:
# struct qcam {  int width, height;
# accessed as: q->width
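
For reference, the header of a default 160x120 qcam capture would look like
this (the first three lines are ascii; the remaining 160*120 = 19200 bytes
are binary pixel data):

    P5
    160 120
    255
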
-- Albert Tam, class of 1999