Every year we totally redesign the course to keep up with current trends
and events, and to pivot toward what each year's students are most
passionate about.
Therefore previous years' course material should not be seen as limiting what
we will learn this year!
You can see some material from previous years'
ECE516 below, e.g. from a couple of years prior to the pandemic:
Wearable Computing, AI/HI (Humanistic Intelligence),
HuMachine Learning, IoTTT (Internet of Things That Think),
AR/VR (Augmediated/Virtual Reality),
and Extended Intelligence
The Birth of Wearable Computing: 41 years of Augmented Reality.
(Left-to-right):
Steve Mann, age 12, Sequential Wave Imprinting Machine (SWIM) in 1974;
Stephanie Mann, Age 9, Robot for visualizing ElectroMagnetic Wave Propagation;
Jayse Hansen, Hollywood's #1 UI designer (now a Meta employee) with Meta Glass;
Metasensing (visualizing vision and sensing sensors and their capacity to sense).
The intro video below, at the beginning of this TED talk, is by a student who delivered something creative as a first
lab. It drew on his expertise in film to contribute to the world of
Intelligent Image Processing, Augmented Reality, and Humanistic Intelligence.
Likewise, you may explore the intersection of what you are good at and the world
of VR/AR, IoT, Wearables, and HuMachine Learning (AI/HI).
Lab 1 (link to info) is due 2018 January 12th, at 9am, or by special arrangement (at another time) for those with a conflict.
Remember the scheduled lab times in BA3155 and BA3165 are grading sessions;
work should be completed prior to arrival.
Also remember that you're free to propose your own labs in place of any of the
standard assigned labs; just make sure to get the project approved.
2016 Lab4, opportunity to build your own EyeTap, or to
explore course material on HI (HuMachine Learning):
Pictures from lab4:
Pictures from lab5 (just back from Reading Week):
Also, feel free to start right now on your final project if you like,
as you can build up to it each week....
A good final project would be on Integral Kinematics and Integral Kinesiology:
Absement (Absition)
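Absement (absition) is the time integral of displacement: for example,
holding a door open 1 m for 2 s accumulates 2 m·s of absement. As a
minimal sketch in GNU Octave (the tool used for the course's online
examples), with a hypothetical displacement signal:

    % Absement is the running time-integral of displacement.
    % Hypothetical example: a door opened 1 m, held, then closed.
    t = linspace(0, 10, 1001);    % time, in seconds
    x = min(t, 1) .* (t < 8);     % displacement in metres: opens, holds, closes
    A = cumtrapz(t, x);           % absement, in metre-seconds
    plot(t, x, t, A);
    legend('displacement x(t) [m]', 'absement [m s]');
    xlabel('time [s]');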
Intro example of Augmented Reality for lecture 1:
visualizing magnetic waves with a loop antenna; notice how the radio wave "sits" still ("sitting wave" rather than
standing wave): (link)
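Why the wave "sits": the visualization multiplies the received wave by a
reference oscillation at the transmitter's frequency and averages, and the
averaged product depends on position only. A quick sketch of the standard
trigonometric reasoning (our summary, not a quote from the lecture):

    % Received wave at position x, times a reference at the same frequency:
    \[ \cos(\omega t - kx)\,\cos(\omega t)
       = \tfrac{1}{2}\cos(kx) + \tfrac{1}{2}\cos(2\omega t - kx) \]
    % Time-averaging (low-pass filtering) removes the 2-omega term:
    \[ \bigl\langle \cos(\omega t - kx)\cos(\omega t) \bigr\rangle_t
       = \tfrac{1}{2}\cos(kx) \]
    % No t remains: the displayed pattern sits still in space, whereas a
    % standing wave's profile still oscillates in time.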
Multidisciplinary course with opportunities beyond the classroom:
Students participate in creating the frontier of Wearables, IoT,
and AR, by co-authoring top-tier (IEEE, ACM) papers with Prof. Mann and
developing new technologies to make the world a better place;
Former students of this course have gone on to start multimillion-dollar
multinational companies like InteraXon
and Meta;
A chance to join the world-class Humanistic Intelligence Lab
where you'll be surrounded by the world's top electrical and mechanical
engineers, physicists, mathematicians, inventors, artists,
and philosophers;
Develop first-degree connections to the top thinkers, entrepreneurs, and
investors at MIT, Stanford, Y-Combinator,
and Rotman Creative Destruction Lab;
The best students have the opportunity to present their work in
Silicon Valley, California, and have opportunities at Meta
(spaceglasses.com).
As Chief Scientist of the Rotman School of Management's
Creative Destruction Lab,
where investors have a
combined net worth of over $2 billion, Prof. Mann
can help students with enterprise- and
entrepreneurship-related efforts.
Maktivism: Making physical objects and cloud-based
computing apps. Goal: to make the world a better place.
Design technology integrating with the human body,
form, and function such as Digital Eye Glass Wearable Computing
devices for AR (Augmediated Reality)
Wearable Computing is now a $241 billion industry
Creative Final Evaluation allows students to focus on their personal
interests:
5 questions where you can get
100% by answering only one question really well.
Rather than memorizing and regurgitating the textbook, simply show
critical thinking on one topic of your choice and interest.
Thanks to Martine Rothblatt
for funding
"The Steven Mann Award for Wearable Computing", a cash prize
awarded to the top
undergraduate student in ECE516; last year's recipient,
Nima Yasrebi, received the $2500 cash prize.
A motto we live by is the IEEE's motto "Advancing Technology for Humanity"
(IEEE is the world's largest technical society),
and that was the topic of our IEEE ISTAS conference.
Assignment 3 for 2015:
Make a 1-pixel camera. To be described in more detail in lecture of Monday
2015feb2.
Assignment 4 for 2015:
Calibrate your 1-pixel camera as described in lecture Monday 2015feb09.
See "Photocell experiment" below.
In particular, create a comparagraph of f(kq) vs. f(q),
with well-labelled axes, data points, and rigorously defined variables.
Take one data set while trying to exactly double/halve the quantity of light,
and graph it. Graph another data set while changing the quantity of light by
steps in a different ratio. How would you fit a function to this relationship?
Is it possible to figure out the original function f(q) vs. q?
What does this represent?
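To make the graphing concrete, here is a minimal GNU Octave sketch using a
hypothetical power-law response f(q) = q^(1/2.2) as a stand-in for your
photocell's unknown response (estimating the real f from data is the point
of the lab):

    % Comparagraph sketch with a hypothetical response f(q) = q^(1/2.2), k = 2.
    f = @(q) q.^(1/2.2);           % assumed response; your photocell's f is unknown
    q = linspace(0.01, 0.5, 100);  % quantity of light, arbitrary units
    k = 2;                         % exposure ratio (doubling the light)
    plot(f(q), f(k*q), 'o');
    xlabel('f(q)'); ylabel('f(kq)');
    % For f(q) = q^a, f(kq) = k^a * f(q): the comparagraph is a straight
    % line through the origin with slope k^a, so fitting that slope (with
    % k known) recovers a, and hence f; one answer to the fitting question.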
Bonus marks for doing this with AC (alternating current) signals and
quadrature detection (e.g. building an oscillator and detector circuit).
See University of Colorado, Physics 3340, for an example.
We also have some wave analyzers as well as the
SR510 lock-in amplifier available.
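As a numerical illustration of what quadrature detection buys you, here is
a simulation sketch in GNU Octave (not a circuit design; the SR510 does the
equivalent in analog hardware):

    % Quadrature (lock-in) detection: recover the amplitude and phase of a
    % weak 1 kHz tone buried in noise, given a reference at the same frequency.
    fs = 48000;  t = (0:fs-1)/fs;                      % one second of samples
    fr = 1000;                                         % reference frequency, Hz
    sig = 0.1*cos(2*pi*fr*t + 0.7) + randn(size(t));   % weak tone + heavy noise
    I = mean(sig .* cos(2*pi*fr*t));                   % in-phase average
    Q = mean(sig .* sin(2*pi*fr*t));                   % quadrature average
    amplitude = 2*sqrt(I^2 + Q^2)                      % approx. 0.1
    phase = atan2(-Q, I)                               % approx. 0.7 rad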
Bonus marks still available for feedbackography, but this time, let's
"raise the bar" a bit (in fairness to those who got this working last time)
and get the feedback extending over a further range (e.g. greater distance
from the camera with a good visible image).
See an example here,
and also here's some info on
animations
in .gif images.
Inventrepreneurship:
S. Mann's role as the Chief Scientist at
Rotman School of Management's Creative Destruction Lab brings us
a series
of inventions we can learn from and work with; ask Prof. Mann for the URL.
ECE516 (formerly known as ECE1766), since 1998 (2018 is its 21st year)
Teaching assistants:
Max Lv (Lu) Hao: EA302
Sen Yang: EA302
Schedule for January 2018:
One-hour lectures on Monday, Tuesday, and Thursday (see the times below,
or verify via the coursefinder):
Lec 0101 MONDAY 16:00-17:00 GB 120
Lec 0101 TUESDAY 12:00-13:00 GB 120
Lec 0101 THURSDAY 16:00-17:00 GB 120
Pra 0101 FRIDAY 09:00-12:00 BA 3165 and BA 3155
Office hours: three per week, the hour immediately following each lecture
(Mon. 5pm, Tues. 1pm, Thurs. 5pm), in EA302 or the classroom if available;
for privately scheduled meetings, in Prof. Mann's office, SF2001.
Lab: Mon. 3pm to 6pm BA3155 and 3165, or EA302 or alternate location depending on subject of lab
Important dates 2015 (to be updated 2016):
2015mar08: last day for undergraduate engineering students to drop a course.
2015apr10 = last day of classes
2015apr14-29 = exams
For CS grad students, see
http://web.cs.toronto.edu/program/currentgradstudents/gradprogram/2014-15courselisttimetable.htm
Each year this course is taught, times can be verified from the official schedule at: APSC 2010 Winter Undergraduate Timetable (this URL seems to have remained constant for a number of years now).
Exam schedule subject to change;
for the latest, check http://www.apsc.utoronto.ca/timetable/fes.aspx
As an example of a typical exam time and type, in a previous year, the exam was:
"Manoel lives in California with his wife and children.
He admires Dr. Steve Mann,
who is considered the real father of wearable computers,
and David Rolfe, a notable Assembler programmer who created classic
arcade games in the 1980s." (page xix)
Labs were organized according to these six units (the first unit on
KEYER, etc., includes more than one lab, because there is some intro
material, background, getting started, etc.).
Organization of the course usually follows the six chapters in the
course TEXTBOOK, but if you are interested in
other material please bring this to the attention of the course instructor
or TA and we'll try to incorporate your interests into the course design.
Location of the course textbook in the University of Toronto bookstore:
Kevin reported as follows:
I just stopped by the UofT Bookstore, and to help the rest of the
students, I thought you could announce that the book is located in the
engineering aisle, and exactly to the left of the bookstore computer
terminal behind some Investment Science books.
Course summary:
ECE516 is aimed primarily at third and fourth year undergraduates, and
first year graduate students. Fourth-year undergraduates
often take this course as their
"other technical elective" (fourth-year elective).
The classes consist of lectures and labs (labs have both a tutorial
component and a grading component)
starting in January, with a final exam in April.
The course provides the student with the fundamental knowledge needed
in the rapidly growing field of Personal Cybernetics
("minds and machines", e.g. mind-machine interfaces, etc.)
and Personal Intelligent Image Processing. These topics are
often referred
to colloquially as "Wearable Computing", "Personal Technologies",
"Mobile Multimedia", etc.
The course focuses on the future of computing and
what will become the most important
aspects of truly personal computation and communication.
Very quickly we are witnessing a merging of communications devices
(such as portable telephones) with computational devices (personal
organizers, personal computers, etc.).
The focus of this course is on the specific and fundamental aspects of
visual interfaces that will have the greatest relevance and impact,
namely the notion of a computationally mediated reality,
as well as related topics such as Digital Eye Glass,
brain-computer interfaces (BCI), etc.,
as explored in collaboration with some of our startups, such as
Meta, and
InteraXon,
a spinoff company started by former
students from this course.
A computationally mediated reality is a natural extension
of next-generation computing.
In particular, we have witnessed a pivotal shift from mainframe computers
to the personal/personalizable computers owned and operated by individual
end users. We have also witnessed a fundamental change in the nature of
computing from large mathematical calculations, to the use of computers
primarily as a communications medium. The explosive growth of the
Internet, and more recently, the World Wide Web, is a harbinger
of what will evolve into a completely computer-mediated world in
which all aspects of life, not just cyberspace, will be online and
connected by visually based content and visual reality user interfaces.
This transformation in the way we think and communicate will not
be the result of so-called ubiquitous computing
(microprocessors in everything around us).
Instead of the current vision of
"smart floors", "smart lightswitches", and "smart toilets"
in "smart buildings"
that watch us and respond to our actions,
what we will witness is the emergence of "smart people":
intelligence attached to people, not just to buildings.
And this will be done,
not by implanting devices into the brain, but, rather,
simply by non-invasively "tapping" the highest-bandwidth "pipe"
into the brain, namely the eye. This so-called "eye tap" forms the
basis for devices that are currently built into eyeglasses
(prototypes are also being built into contact lenses) to tap into the
mind's eye.
EyeTap technology causes inanimate objects to suddenly come to life
as nodes on a virtual computer network. For example, while
walking past an old building, the building may come to life with
hyperlinks on its surface, even though the building is not wired
for network connections in any way. These hyperlinks are merely a
shared imagined reality that wearers of the EyeTap technology
simultaneously experience.
When entering a grocery store, a milk carton may come to life,
with a unique message from a spouse, reminding the wearer of the
EyeTap technology to pick up some milk on the way home from work.
EyeTap technology is not merely about a computer screen inside
eyeglasses, but, rather, it's about enabling what is, in effect,
a shared telepathic experience connecting multiple individuals together
in a collective consciousness.
EyeTap technology will have many commercial applications, and emerge
as one of the most industrially relevant forms of communications
technology.
The WearTel (TM) phone, for example, uses EyeTap technology to allow
individuals to see each other's point of view.
Traditional videoconferencing
merely provides a picture of the other person.
But most of the time we call
people we already know, so it is far more useful for us to exchange
points of view. Therefore, the miniature laser light source inside the
WearTel eyeglass-based phone scans across the retinas of both parties
and swaps the image information, so that each person sees what the other
person is looking at. The WearTel phone, in effect, lets someone
"be you",
rather than just "see you". By letting others put themselves
in your shoes and see the world from your point of view, a very powerful
communications medium results.
The course includes iPhone and Android phone technologies,
eyeglass-based "eyePhone" hybrids, and
online materials and online examples using the GNU Octave
program, as well as other parts of the GNU Linux operating system
(most of us use Ubuntu, Debian, or the like).
The course will follow very closely to the textbook which is organized into
these six chapters:
Personal Cybernetics:
The first chapter introduces the general
ideas of "Wearable Computing", personal technologies, etc.
See http://wearcam.org/hi.htm.
Personal Imaging:
cameras getting smaller and easier to carry;
wearing the camera (the instructor's
fully functioning XF86 GNUX wristwatch videoconferencing system,
http://wearcam.org/wristcam/);
wearing the camera in an "always ready" state
Mediated Reality and the EyeTap Principle.
Collinearity criterion (one formulation is sketched after this chapter list):
The laser EyeTap camera: Tapping the mind's eye: infinite depth of focus
Contact lens displays, blurry information displays,
and vitrionic displays
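The collinearity criterion is stated and derived in the textbook; as a
rough sketch of the idea (our paraphrase, not the textbook's exact
formulation): each ray of eyeward-bound light must be collinear with the
corresponding ray sensed by the camera, so that camera and eye share a
single center of projection.

    % A ray of eyeward-bound light through the eye's center of projection o,
    % with direction d:
    \[ \mathbf{r}(s) = \mathbf{o} + s\,\mathbf{d}, \qquad s \ge 0 \]
    % The diverter must deliver light along this same line to the camera,
    % and the display (aremac) must resynthesize light along the same line,
    % so camera, display, and eye all share the center of projection o.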
Lecture, lab, and tutorial schedule from previous years
Here is an example schedule from a previous year the course was taught.
Each year we modify the schedule to keep current with the latest research
as well as with the interests of the participants in the course. If you have
anything you find particularly interesting, let us know and we'll consider
working it into the schedule...
Week1 (Tue. Jan. 4 and Wed. Jan. 5th):
Humanistic Intelligence for Intelligent Image Processing
Humanistic User Interfaces, e.g. "LiqUIface" and other novel
inputs that have the human being in the feedback loop of a
computational process.
Week2: Personal Imaging; concomitant cover activity and
VideoClips; Wristwatch videophone; Telepointer,
metaphor-free computing, and Direct User Interfaces.
Week4: EyeTap part1;
technology that causes the eye itself to
function as if it were both a camera and display;
collinearity criterion; Calibration of EyeTap
systems; Human factors and user studies.
Week5: Eyetap part2; Blurry information displays;
Laser Eyetap; Vitrionics (electronics in glass);
Vitrionic contact lenses.
Week10: Lightspace and anti-homomorphic vectorspaces.
Week11: VideoOrbits, part1; background
PDC intensive course may also be offered around this time;
Week12: VideoOrbits, part2; Reality Window Manager (RWM);
Mediated Reality; Augmented Reality in industrial
applications; Visual Filters; topics for further
research (graduate studies and industrial
opportunities).
Week13: review for final exam.
Final Exam: usually held sometime between
mid-April and the end of April.
This course was originally offered as ECE1766; for the origins of the
course and info from previous years, see
http://wearcam.org/ece1766.htm
Above: one of our neckworn sensory cameras, designed and built
at University of Toronto, 1998, which later formed the basis
for Microsoft's SenseCam.
CyborGLOG of Lectures from previous year
(...on sabbatical 2009, so the course was not offered last year. Therefore,
the most up-to-date previous course CyborGLOG is from 2008.)
Tue 2008 Jan 08:
My eyeglasses were recently broken (damaged) when I fell into a live
three-phase power distribution station that was, for some strange
reason, set up on a public sidewalk by a film production company.
As a result, my eyeglasses are not working too well. Here is
a poor-quality but still somewhat useful (understandable) capture of the
lecture
as transmitted live (archive); please forgive the poor eyesight
resulting from temporary replacement eyewear.
Christina Mann's fun guide: How to fix things,
drill holes, install binding posts, and solder wires to terminals
Material from year 2007:
Lab 2007-0: Demonstration of an analog keyboard
Example of analog keyboard; continuous fluidly varying input space:
Lab 2007-1, Chapter 1 of textbook: Humanistic Intelligence
In lab 1 you will demonstrate your understanding of Humanistic Intelligence,
either by making a keyer, or by programming an existing keyer,
so that you can learn the overall concept.
Choose one of:
Build a keyer that can be used to control a computer, camera,
music player, or other multimedia device. Your keyer can be modeled
after the standard Twiddler layout.
You can take a look at
http://wearcam.org/ece516/musikeyer.htm
to get a rough sense of what our typical keyers look like.
See also pictures from last year's class, of various keyers that
were made, along with the critiques given of each one.
Your keyer should have 13 banana plugs on it: one common and 12 others,
one of these 12 for each key.
If you choose this option, your keyer will be graded for
overall engineering (as a crude and simple prototype),
ergonomics, functionality, and design.
Modify one or more of the programs on an existing keyer to demonstrate
your understanding of these keyers.
Many of our existing keyers use the standard Twiddler layout,
and are programmed using the C programming language to program
one or more Atmel ATMEGA 48 or ATMEGA 88 microcontrollers.
If you choose this project, please contact Prof. Mann or T.A. Ryan Janzen,
to get a copy of the C program, and to discuss a suitable modification
that demonstrates your understanding of keyers.
Assemble a "blueboard" (we supply printed circuit board and
pre-programmed chips). Should have experience soldering DIPs, etc..
The blueboard is the standard interface for most wearable computers
and personal cybernetics equipment, and features 15 analog inputs
for 12 finger keys and 3 thumb keys.
If you choose this option, please contact Prof. Mann or T.A. Ryan Janzen,
to determine materials needed.
Implement a bandpass filter. You can do this using a suitable personal
computer that has a sound card, that can record and play sound
(at the same time). You can write a computer program yourself,
or find an existing program written by someone else that you can use.
Your filter should monitor a microphone input,
filter it, and send the output to a speaker.
The filter should be good enough that it can be set to a particular
frequency, for example, 440Hz, and block sounds at all but that
frequency. Any sound going into the microphone should be audible
at that specific frequency only, i.e. you should notice a musical or
tonal characteristic in the output when the input is driven with
arbitrary or random sound. (A minimal filtering sketch follows this
list of options.)
Ideally we would have at least one person doing each part of this project
so that we can put a group together for the entire result (keyer).
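For the filter option above, here is a minimal offline GNU Octave sketch:
it runs a vector through a narrow biquad bandpass centred on 440 Hz, with
coefficients from the widely used RBJ "Audio EQ Cookbook" (a standard
public formula, not something specific to this course). Live
microphone-to-speaker operation needs additional audio plumbing not
shown here.

    % Narrow bandpass biquad centred at 440 Hz (RBJ audio-EQ-cookbook form).
    fs = 48000;                              % sample rate, Hz
    f0 = 440;  Qf = 30;                      % centre frequency; high Q = narrow band
    w0 = 2*pi*f0/fs;  alpha = sin(w0)/(2*Qf);
    b = [alpha, 0, -alpha];                  % feedforward coefficients
    a = [1 + alpha, -2*cos(w0), 1 - alpha];  % feedback coefficients
    x = randn(fs, 1);                        % one second of noise standing in for the mic
    y = filter(b, a, x);                     % Octave's built-in IIR filter
    soundsc(y, fs);                          % listen: a pitched hiss near 440 Hz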
Lab 2007-2, Chapter 2 of textbook: Eyeglass-based display device
In this lab we will build a simple eyeglass-based display device,
having a limited number of pixels, in order to understand the concept
of eyeglass-based displays and viewfinders.
This display device could function with a wide range of different kinds of
wearable computing devices, such as your portable music player.
Today there were two really cool projects that deserve mention in the
ECE516 Hall of Fame:
David's comparametric analysis and CEMENTing of telescope images:
Peng's tone generator:
Lab 2007-5, Chapter 5 of textbook: Lightvectors
Lab 2007-6 and 7
Final projects:
something of your choosing, to show what you've learned so far.
No written or otherwise recorded report is required.
However, if you choose to write or record some form of report or
other support material, it need not be of a formal nature, but
you must, of course, abide by good standards of academic conduct,
e.g. any published or submitted material must:
properly cite reference source material (e.g. where any ideas that you
decide to use came from);
properly cite collaborations with others.
You are free to do individual projects or group projects,
and free to discuss and collaborate even if doing individual projects,
but must cite other partners, collaborators, etc..
If you choose not to provide a written report, but only to demonstrate
(verbal report, etc.), in the lab, you still need to state your
source and collaboration material.
It is expected that all students will have read and agreed to the terms
of proper academic conduct. This is usually introduced
in first year, but for anyone who happens to have missed it in earlier years,
it's here:
How Not to Plagiarize.
It's written mainly to apply to writing, but the ethical concept is
equally applicable to presentations, ideas, and any other representation
of work, research, or the like.