We want to mimic real-life physical interaction as much as possible. By locating the user's nose with PoseNet's machine-learning model, we let users "shake" their screen and bounce to the music together in this virtual club.
Here is what we want to achieve in this work:
1. video for each user
2. users can choose different filters for their video by clicking filter buttons
3. users can move their body left and right to move their video (plan B: use key presses to move)
4. a song plays once the user enters the "room"
5. fancy CSS styling
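For goal 3, the core logic is mapping the nose position PoseNet reports to a screen position for the video. Here is a minimal sketch of that mapping; the function and parameter names are our own assumptions, not the exact site code. In the real sketch the nose coordinate would come from ml5's poseNet callback, and plan B would feed arrow-key presses into the fallback function.

```javascript
// Map noseX (0..videoWidth in webcam pixels) to a CSS `left` offset so the
// video element stays fully on screen. The webcam image is mirrored, so we
// flip the normalized position before scaling.
function noseToVideoLeft(noseX, videoWidth, screenWidth, videoDisplayWidth) {
  const t = Math.min(Math.max(noseX / videoWidth, 0), 1); // normalize to 0..1
  const mirrored = 1 - t; // mirror: leaning left moves the video left
  return mirrored * (screenWidth - videoDisplayWidth);
}

// Plan B fallback: nudge the video with arrow keys, clamped to the screen.
function keyToVideoLeft(currentLeft, key, step, screenWidth, videoDisplayWidth) {
  if (key === 'ArrowLeft') currentLeft -= step;
  if (key === 'ArrowRight') currentLeft += step;
  return Math.min(Math.max(currentLeft, 0), screenWidth - videoDisplayWidth);
}
```

In the page, the result would be applied to the video element with something like `videoDiv.style.left = noseToVideoLeft(pose.nose.x, 640, innerWidth, 320) + 'px'`.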
Our plan is that Rae works on the p5 and machine-learning part, and I do the rest. Rae has finished the body-tracking video movement in a p5 sketch (LINK). The problem we ran into was that we didn't know how to merge the p5 code into our HTML.
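The usual way to merge a p5 sketch into a page is to load the libraries with script tags and let the sketch attach its canvas to a chosen element. A minimal sketch of an `index.html`, assuming the p5 code lives in a file called `sketch.js` (the CDN versions here are just one possible choice):

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.min.js"></script>
    <script src="https://unpkg.com/ml5@latest/dist/ml5.min.js"></script>
  </head>
  <body>
    <!-- the sketch's canvas can be parented into this div -->
    <div id="sketch-holder"></div>
    <script src="sketch.js"></script>
  </body>
</html>
```

Inside `sketch.js`, calling `createCanvas(w, h).parent('sketch-holder')` in `setup()` places the canvas inside that div, so it can be styled and positioned with the rest of the CSS.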
Mozilla canvas tutorial (Chinese version): https://developer.mozilla.org/zh-CN/docs/Web/API/Canvas_API/Tutorial
How to make canvas full screen: https://stackoverflow.com/questions/4288253/html5-canvas-100-width-height-of-viewport
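The p5 version of that full-screen trick looks roughly like the following: size the canvas to the window, resize it when the window changes, and remove the default body margin so no scrollbars appear. This is a sketch, not the exact code we shipped:

```html
<style>
  /* remove default margins so the canvas can fill the whole viewport */
  html, body { margin: 0; padding: 0; overflow: hidden; }
  canvas { display: block; } /* avoids the small gap under inline elements */
</style>
<script>
  function setup() {
    createCanvas(windowWidth, windowHeight);
  }
  function windowResized() {
    resizeCanvas(windowWidth, windowHeight); // stay full screen on resize
  }
  function draw() {
    background(0);
  }
</script>
```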
Thanks to Shawn, Emily, and Guillermo for office hours, and to Jason.
I made a simple live chat that I like to call "Chat Ocean". Whatever you send appears at a random spot on the page. I wanted the texts to flow in the ocean, the way our information flows around the internet, but that turned out to be too difficult, so I ended up with this simpler version.
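The random placement in the simple version can be sketched as a small helper: given the page size and the message size, pick a position that keeps the whole message on screen. The random-number source is passed in as a parameter here only to make the function easy to test; in the page you would pass `Math.random` and apply the result to each message element. All names are hypothetical, not the actual site code.

```javascript
// Pick a random top-left position for a message so it fits entirely
// inside the page. rng is a function returning a number in [0, 1).
function randomPlacement(rng, pageWidth, pageHeight, msgWidth, msgHeight) {
  return {
    left: Math.floor(rng() * (pageWidth - msgWidth)),
    top: Math.floor(rng() * (pageHeight - msgHeight)),
  };
}
```

Usage in the page might look like `const pos = randomPlacement(Math.random, innerWidth, innerHeight, el.offsetWidth, el.offsetHeight)`, then setting `el.style.left = pos.left + 'px'` and `el.style.top = pos.top + 'px'` on an absolutely positioned message div.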