Microsoft SenseCam

SenseCam as typically worn, in comparison with its predecessor (Wearable Wireless Webcam) and its successor (Memoto)

Microsoft's SenseCam is a lifelogging camera with a fisheye lens and trigger sensors, such as accelerometers, heat sensors, and audio, invented by Lyndsay Williams; a patent[1] was granted in 2009. Usually worn around the neck, the SenseCam is used for the MyLifeBits project, a lifetime storage database. Early developers were James Srinivasan and Trevor Taylor.

Earlier work on neck-worn sensor cameras with fisheye lenses was done by Steve Mann, and published in 2001.[2][3]

The Microsoft SenseCam, Mann's earlier sensor cameras, and subsequent similar products such as the Autographer, Glogger, and the Narrative Clip are all examples of wearable computing.[4]

SenseCam photo of King's College, Cambridge
SenseCam prototype circa 2003

Neck-worn wearable cameras make it easier to collect and index one's daily experiences by unobtrusively taking photographs whenever a change in temperature, movement, or lighting triggers an internal sensor. The SenseCam[5] is also equipped with an accelerometer, which is used both to trigger images and to stabilise them so as to reduce blur. The camera is usually worn around the neck on a lanyard.
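The trigger logic described above can be sketched as a simple threshold test on consecutive sensor readings. This is an illustrative approximation, not the device's firmware; the sensor fields and threshold values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    light: float        # ambient light level
    temperature: float  # degrees Celsius
    accel: float        # acceleration magnitude

def should_capture(prev: Reading, cur: Reading,
                   light_delta: float = 100.0,
                   temp_delta: float = 1.0,
                   accel_delta: float = 0.5) -> bool:
    """Fire the camera when any sensor changes by more than its threshold."""
    return (abs(cur.light - prev.light) > light_delta
            or abs(cur.temperature - prev.temperature) > temp_delta
            or abs(cur.accel - prev.accel) > accel_delta)

# Simulated readings: a lighting change, then a temperature change.
readings = [Reading(300, 21.0, 1.0), Reading(310, 21.2, 1.1),
            Reading(900, 21.3, 1.0), Reading(905, 24.0, 1.0)]
captured = [i for i in range(1, len(readings))
            if should_capture(readings[i - 1], readings[i])]
# captured -> [2, 3]: the large light change and the temperature jump
```

A real device would run this test continuously against a timer-based fallback, so that an image is captured periodically even when no sensor fires.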

The photos represent almost every experience of the wearer's day. They are taken through a wide-angle lens so that each image is likely to contain most of what the wearer can see. The SenseCam uses flash memory capable of storing upwards of 2,000 photos per day as .jpg files, and more recent models with larger and faster memory cards allow a wearer to store up to 4,000 images per day. These files can then be uploaded and automatically played back as a daily movie, which can be reviewed and indexed using a custom viewer application running on a PC; the images from a single day can be replayed in a few minutes.[5] An alternative way of viewing images is to have a day's worth of data automatically segmented into 'events' and to use an event-based browser, which displays each event (of 50, 100, or more individual SenseCam images) using a keyframe chosen as a representative of that event.
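The event-based browsing described above rests on two steps: splitting the image stream into events wherever consecutive images differ sharply, and picking one keyframe per event. A minimal sketch follows, using a toy L1 distance on per-image feature vectors and the middle image as keyframe; the feature representation, distance measure, and threshold are all illustrative assumptions, not the published segmentation method.

```python
def segment_events(features, threshold=0.5):
    """Split a day's image sequence into events: start a new event
    wherever consecutive feature vectors differ by more than the
    threshold (toy L1 distance)."""
    events, current = [], [0]
    for i in range(1, len(features)):
        dist = sum(abs(a - b) for a, b in zip(features[i - 1], features[i]))
        if dist > threshold:
            events.append(current)
            current = [i]
        else:
            current.append(i)
    events.append(current)
    return events

def keyframe(event):
    """Pick the middle image of an event as its representative keyframe."""
    return event[len(event) // 2]

# Toy 2-D features: three similar office images, then two outdoor images.
feats = [(0.10, 0.10), (0.12, 0.10), (0.11, 0.09),
         (0.90, 0.80), (0.88, 0.82)]
events = segment_events(feats)            # [[0, 1, 2], [3, 4]]
keyframes = [keyframe(e) for e in events] # [1, 4]
```

Published SenseCam browsers combine image features with the accompanying sensor streams and more robust keyframe-quality measures, but the segment-then-summarise structure is the same.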

SenseCams have mostly been used in medical applications, particularly to aid those with poor memory as a result of disease or brain trauma. Several studies have been published by Chris Moulin, Aiden R. Doherty and Alan F. Smeaton[6] showing how reviewing one's SenseCam images can lead to what Martin A. Conway, a memory researcher from the University of Leeds, calls "Proustian moments",[7] characterised as floods of recalled details of some event in the past. SenseCams have also been used in lifelogging, and one researcher at Dublin City University, Ireland, has been wearing a SenseCam for most of his waking hours since 2006 and has generated over 13 million SenseCam images of his life.[8]

In October 2009, SenseCam technology was commercialised as the Vicon Revue and is now available as a product.[9]

There is a wiki dedicated to SenseCam technical issues, software, news, and various research activities and publications about, and using, SenseCam.[10]

Projections


Microsoft Research has contributed a device that aids lifeloggers, among several potential users. The SenseCam was first developed to help people with memory loss and is currently being tested as an aid for those with serious cognitive impairment. The SenseCam produces images very similar to one's own memories, particularly episodic memory, which usually takes the form of visual imagery.[11] By reviewing the day's filmstrip, patients with Alzheimer's disease, amnesia, and other memory impairments found it much easier to retrieve lost memories.

Microsoft Research has also tested internal audio level detection and audio recording for the SenseCam, although there are currently no plans to build these into the research prototypes. The research team is also exploring the potential of including sensors that monitor the wearer's heart rate, body temperature, and other physiological changes, along with an electrocardiogram recorder, when capturing pictures.

Other possible applications include using the camera's records for ethnographic studies of social phenomena, monitoring food intake, and assessing an environment's accessibility for people with disabilities.[12]

See also

References

Further reading