Lev Feels Fine Demo


0.1 Overview

Due Friday Sept 17.

PAGE 1

Ask for the person's Twitter handle or a hashtag. Also ask for their location.

PAGE 2

  • A video is created from 3 shots; it must play seamlessly, with no hiccups (this is a big component of the demo)
  • The first shot will be of a face looking towards the camera. It will be chosen from happy, sad, or angry; which emotion is determined by the user's Twitter ID, depending on how often they mention those emotions (see the sketch after this list)
  • The 2nd shot will be a still image pulled from a Flickr feed of happy, sad, or angry faces, chosen the same way
  • The 3rd shot will be identical to the 1st
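
The mood-selection rule above can be sketched in a few lines of TypeScript. It assumes the person's recent tweet texts have already been fetched (see the Twitter API task below); the fallback to "happy" when no keyword appears is an assumption, since the spec doesn't say what to do in that case.

type Mood = "happy" | "sad" | "angry";

// Count how often each emotion word appears across the tweets and return the
// most frequent one.
function chooseMood(tweets: string[]): Mood {
  const counts: Record<Mood, number> = { happy: 0, sad: 0, angry: 0 };

  for (const text of tweets) {
    const lower = text.toLowerCase();
    for (const mood of Object.keys(counts) as Mood[]) {
      counts[mood] += lower.split(mood).length - 1; // occurrences of the keyword
    }
  }

  // Fall back to "happy" when nothing matches (assumption).
  return (Object.keys(counts) as Mood[]).reduce(
    (best, mood) => (counts[mood] > counts[best] ? mood : best),
    "happy" as Mood
  );
}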

Brett to send video to use as soon as possible; use whatever is available in the meantime.

0.2 Tasks

  • Twitter API
    • get the person's tweets and search them for mood keywords (happy, sad, angry); rough sketches of each of these tasks follow this list
  • Geolocation
    • get the weather for the person's location and map it to a background image and audio track
  • Flickr API
    • pull in images from Flickr for "faces"
  • Canvas/Video montage
    • the video will be 480x270 pixels
    • seamless combination of video + images (e.g., use a canvas to show the video and images, hiding the img and video elements)
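
For the Twitter API task, a hedged sketch assuming the 2010-era v1 statuses/user_timeline endpoint and that it returns an array of tweet objects with a text field; the endpoint, response shape, and lack of authentication are assumptions, and in a browser this would likely need JSONP or a server-side proxy to get around cross-origin restrictions.

interface Tweet {
  text: string;
}

// Fetch the person's recent tweet texts so they can be fed to chooseMood() above.
async function fetchRecentTweets(screenName: string, count = 50): Promise<string[]> {
  const url =
    "http://api.twitter.com/1/statuses/user_timeline.json" +
    `?screen_name=${encodeURIComponent(screenName)}&count=${count}`;

  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Twitter request failed: ${response.status}`);
  }

  const tweets: Tweet[] = await response.json();
  return tweets.map((t) => t.text);
}

// Usage: fetchRecentTweets("someuser").then((texts) => console.log(chooseMood(texts)));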
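
For the geolocation task, a sketch that grabs the browser position and maps a weather condition onto background and audio assets. navigator.geolocation is the standard browser API; the weather endpoint, its response shape, and the asset file names are hypothetical placeholders, since the spec doesn't name a weather provider.

interface Weather {
  condition: "clear" | "clouds" | "rain" | "snow";
}

// Wrap the callback-based geolocation API in a promise.
function getPosition(): Promise<GeolocationPosition> {
  return new Promise((resolve, reject) =>
    navigator.geolocation.getCurrentPosition(resolve, reject)
  );
}

async function lookUpWeather(lat: number, lon: number): Promise<Weather> {
  // Hypothetical endpoint; swap in whatever weather API the demo settles on.
  const response = await fetch(`https://example.com/weather?lat=${lat}&lon=${lon}`);
  return response.json();
}

// Map the weather condition to a background image and audio track (placeholder file names).
function assetsFor(weather: Weather): { background: string; audio: string } {
  switch (weather.condition) {
    case "rain":   return { background: "bg-rain.png",   audio: "rain.ogg" };
    case "snow":   return { background: "bg-snow.png",   audio: "wind.ogg" };
    case "clouds": return { background: "bg-clouds.png", audio: "ambient.ogg" };
    default:       return { background: "bg-clear.png",  audio: "birds.ogg" };
  }
}

async function weatherAssets(): Promise<{ background: string; audio: string }> {
  const pos = await getPosition();
  const weather = await lookUpWeather(pos.coords.latitude, pos.coords.longitude);
  return assetsFor(weather);
}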
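
For the Flickr API task, a sketch using the public flickr.photos.search REST method and the documented farm/server/id/secret static-photo URL pattern; the API key is a placeholder, and the exact parameters and response shape should be double-checked against the Flickr docs.

interface FlickrPhoto {
  id: string;
  secret: string;
  server: string;
  farm: number;
}

const FLICKR_API_KEY = "YOUR_API_KEY"; // placeholder

// Search Flickr for photos tagged with the chosen mood and return direct image URLs.
async function fetchMoodPhotoUrls(mood: string, perPage = 10): Promise<string[]> {
  const url =
    "https://api.flickr.com/services/rest/?method=flickr.photos.search" +
    `&api_key=${FLICKR_API_KEY}&tags=${encodeURIComponent(mood + ",face")}` +
    `&per_page=${perPage}&format=json&nojsoncallback=1`;

  const data = await (await fetch(url)).json();
  const photos: FlickrPhoto[] = data.photos.photo;

  return photos.map(
    (p) => `https://farm${p.farm}.staticflickr.com/${p.server}/${p.id}_${p.secret}.jpg`
  );
}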
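
For the montage itself, a minimal sketch of the 3-shot sequence on a 480x270 canvas: draw the face video frame by frame, hold the Flickr still for a few seconds, then replay the same video. Only the canvas is visible (the video and img elements are hidden), which is what keeps the cuts seamless; the 3-second hold on the still is an assumption.

const WIDTH = 480;
const HEIGHT = 270;
const STILL_DURATION_MS = 3000; // assumed length of the middle shot

function playMontage(
  canvas: HTMLCanvasElement,
  faceVideo: HTMLVideoElement,
  stillImage: HTMLImageElement
): void {
  canvas.width = WIDTH;
  canvas.height = HEIGHT;
  const ctx = canvas.getContext("2d")!;

  // Hide the source elements; only the canvas is shown.
  faceVideo.style.display = "none";
  stillImage.style.display = "none";

  let shot = 1;
  let stillShownAt = 0;

  function drawFrame(now: number): void {
    if (shot === 1 || shot === 3) {
      ctx.drawImage(faceVideo, 0, 0, WIDTH, HEIGHT);
      if (faceVideo.ended) {
        if (shot === 1) {
          shot = 2;             // shot 2: hold the still image
          stillShownAt = now;
        } else {
          return;               // montage finished after shot 3
        }
      }
    }
    if (shot === 2) {
      ctx.drawImage(stillImage, 0, 0, WIDTH, HEIGHT);
      if (now - stillShownAt >= STILL_DURATION_MS) {
        shot = 3;               // shot 3: replay the same face video
        faceVideo.currentTime = 0;
        void faceVideo.play();
      }
    }
    requestAnimationFrame(drawFrame);
  }

  void faceVideo.play();
  requestAnimationFrame(drawFrame);
}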

0.3 Ideas

Once we get that done, there is some added complexity I would like to introduce. For instance, the 2nd shot will be an "over the shoulder" view of whoever's face I record, looking at a computer, and we will dynamically place the Flickr image INSIDE the computer (a rough sketch of that compositing follows). I'd also like to group all the emotions like the 5 elements site, but I'm hoping the above text is enough to get started.
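
A rough sketch of that compositing step, assuming a fixed on-screen rectangle for the monitor: the SCREEN_RECT coordinates are made up and would have to be measured from the actual footage (and replaced with a perspective transform if the monitor isn't square to the camera).

const SCREEN_RECT = { x: 300, y: 60, w: 140, h: 100 }; // hypothetical monitor position

// Each frame: draw the over-the-shoulder video, then draw the Flickr image
// into the rectangle where the computer screen appears in the shot.
function drawOverShoulderFrame(
  ctx: CanvasRenderingContext2D,
  video: HTMLVideoElement,
  flickrImage: HTMLImageElement
): void {
  ctx.drawImage(video, 0, 0, 480, 270);
  ctx.drawImage(flickrImage, SCREEN_RECT.x, SCREEN_RECT.y, SCREEN_RECT.w, SCREEN_RECT.h);
}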