WEEK 4: Video

Project: "TNO CODING CLUB" - Marcel / Rae


We want to mimic real-life physical interaction as much as possible. By locating the user’s nose with PoseNet’s machine-learning model, we let users “shake” their screen and bounce to the music together in this virtual club.
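A rough idea of how the nose position could drive the video’s position on screen. This is a hedged sketch, not our actual code: `noseToOffset` and its parameters (`camWidth`, `screenWidth`, `videoWidth`) are made up for illustration, and the commented part shows how it might plug into ml5.js’s PoseNet `'pose'` callback.

```javascript
// Pure helper: convert a nose x-coordinate in the webcam frame into a
// left-offset (in px) for the user's video element on the page.
function noseToOffset(noseX, camWidth, screenWidth, videoWidth) {
  // Mirror the camera so moving left moves the video left on screen.
  const t = 1 - noseX / camWidth;         // normalized 0..1, mirrored
  return t * (screenWidth - videoWidth);  // keep the video inside the page
}

// With ml5.js + p5.js, this helper would be fed from the pose callback:
// const poseNet = ml5.poseNet(video, () => console.log('model ready'));
// poseNet.on('pose', (results) => {
//   if (results.length > 0) {
//     const nose = results[0].pose.nose;
//     myVideo.style.left = noseToOffset(nose.x, 640, innerWidth, 320) + 'px';
//   }
// });
```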

Below are the things we want to achieve in this work.


1. Live video for each user

2. Users can choose different filters for their video by clicking filter buttons

3. Users can move their body left and right to move their video (plan B: use key presses to move)

4. A song plays once a user enters the “room”

5. Fancy CSS styling
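Plan B from item 3 could look something like this minimal sketch. The key codes match p5.js’s `LEFT_ARROW` (37) and `RIGHT_ARROW` (39); `step` and `maxX` are made-up values for illustration.

```javascript
const LEFT_ARROW = 37;   // p5.js key code for the left arrow
const RIGHT_ARROW = 39;  // p5.js key code for the right arrow

// Move the video's x-position by one step, clamped to the page.
function moveX(x, keyCode, step, maxX) {
  if (keyCode === LEFT_ARROW) x -= step;
  if (keyCode === RIGHT_ARROW) x += step;
  return Math.min(Math.max(x, 0), maxX);
}

// In a p5 sketch this would be wired into keyPressed():
// function keyPressed() {
//   videoX = moveX(videoX, keyCode, 20, width - videoW);
// }
```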

Our plan is that Rae works on the p5 and machine-learning part, and I do the rest. Rae has finished the move-your-body-to-move-your-video part in a p5 sketch (LINK). The problem we ran into was that we didn’t know how to merge the p5 code into our HTML page.
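One common way to embed a p5 sketch in an existing HTML page is p5’s instance mode, where the sketch attaches to a specific element instead of taking over the whole page. A minimal sketch, assuming the page loads p5.js and contains a `<div id="club">` (the id is made up):

```javascript
// Instance mode: the sketch receives its own p5 object `p`,
// so it can live alongside other HTML/CSS/JS on the page.
const sketch = (p) => {
  p.setup = () => {
    p.createCanvas(400, 400);
  };
  p.draw = () => {
    p.background(0);
    p.ellipse(p.width / 2, p.height / 2, 50);
  };
};

// Attach the canvas inside <div id="club"> (guarded so this file
// doesn't crash outside the browser).
if (typeof p5 !== 'undefined') new p5(sketch, 'club');
```

The HTML then only needs two script tags (p5.js, then this file) and the target div.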

WEEK 3: Canvas Drawing


Canvas tutorial:

Mozilla canvas tutorial (Chinese version):

How to make canvas full screen:
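For reference, a common full-screen pattern sizes the canvas to the viewport and re-applies the size on every window resize; `myCanvas` is a placeholder id, not from my sketch.

```javascript
// Resize a canvas to the given dimensions. Note: setting width/height
// also clears the canvas, so redraw after calling this.
function fitCanvas(canvas, width, height) {
  canvas.width = width;
  canvas.height = height;
  return canvas;
}

// In the browser:
// const canvas = document.getElementById('myCanvas');
// fitCanvas(canvas, window.innerWidth, window.innerHeight);
// window.addEventListener('resize', () =>
//   fitCanvas(canvas, window.innerWidth, window.innerHeight));
```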




I have a sketch so far, but ran into difficulties adding the socket part and sending the right variables in the HTML file.
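A sketch of how the socket part might send drawing variables, under some assumptions: the `'drawing'` event name and the payload shape are invented here, and `packStroke` just keeps the sender and receiver agreed on which keys the data uses. The commented server side uses socket.io’s broadcast so one client’s stroke reaches every other client.

```javascript
// Normalize the drawing payload once, so both sides agree on its keys.
function packStroke(x, y, px, py, color) {
  return { x, y, px, py, color };
}

// Client side (with socket.io loaded in the HTML):
// const socket = io();
// canvas.addEventListener('mousemove', (e) => {
//   if (drawing) {
//     socket.emit('drawing', packStroke(e.clientX, e.clientY, prevX, prevY, '#000'));
//   }
// });
// socket.on('drawing', (d) => drawLine(d.px, d.py, d.x, d.y, d.color));
//
// Server side (Node + socket.io), relaying to everyone else:
// io.on('connection', (socket) => {
//   socket.on('drawing', (d) => socket.broadcast.emit('drawing', d));
// });
```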


Thanks to Shawn, Emily, and Guillermo for office hours, and to Jason.

WEEK 2: Chat / Node.js

I made a simple live chat that I like to call “Chat Ocean”. Whatever you send appears at a random spot on the page. I wanted the texts to flow in the ocean, the way our information flows around the internet, but that was too difficult… so this simple version came out instead.
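The random placement could be as simple as this sketch (a guess at the approach, not my actual code; the `rand` parameter is injectable just so the helper can be tested):

```javascript
// Pick a random spot within the page for an incoming message.
function randomSpot(pageW, pageH, rand = Math.random) {
  return {
    left: Math.floor(rand() * pageW),
    top: Math.floor(rand() * pageH),
  };
}

// On receiving a chat message in the browser:
// socket.on('chat message', (msg) => {
//   const el = document.createElement('div');
//   el.textContent = msg;
//   const { left, top } = randomSpot(window.innerWidth, window.innerHeight);
//   el.style.position = 'absolute';
//   el.style.left = left + 'px';
//   el.style.top = top + 'px';
//   document.body.appendChild(el);
// });
```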

Thanks: Jason.