This mode grabs live video from a USB webcam. Audio triggers and knob movements manipulate the image.
I developed this on an Organelle M running EYESY OS 2.1.
(Note: if there is no camera connected, a static image is used instead)
Very cool idea. I am just wondering how a USB webcam is going to be accessible via the Eyesy wireless connection. Can you give more context there? This would seem to imply that the Eyesy has to be on the same network as the rest of your setup, and so far I have not seen a way to do that (using only AP mode).
Not sure if I misunderstand the question: the USB webcam is plugged directly into the Eyesy. Since the Eyesy seems to have only one USB port (the Organelle M has two), you either have to use the webcam instead of the WiFi dongle, or use a (powered?) USB hub to plug in both at the same time.
Have fun!
@velolala this is a great idea! I can’t seem to get it working though. I am using a UVC 1.0 webcam, but it never recognizes it. I’ve tried connecting it before and after startup, no success. any tips?
@jjdeprisco you can find how to connect the Eyesy to your network in the manual section 4.1. OS 2.1 fixed some issues in the settings when adding your network info.
@brothervsrobot Thanks for letting me know!
Does the camera show up in `v4l2-ctl --all` if you ssh onto your instrument? I believe the `pygame.camera` module needs a `v4l2`-compatible camera, so this would be the first step for debugging it.
Btw: I have made the following changes to `EYESY_OS/engines/python/main.py` in order to get some more information in case of problems in `/tmp/video.log`:
diff a/EYESY_OS/engines/python/main.py b/EYESY_OS/engines/python/main.py
91c91
< except :
---
> except Exception as e:
92a93
> print e
165,166c166,167
< except :
< etc.error = "Mode " + etc.mode + " not loaded, probably has errors."
---
> except Exception as e:
> etc.error = "Mode " + etc.mode + " not loaded, probably has errors." + str(e)
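The diff above replaces bare `except:` clauses (which swallow every error silently) with handlers that capture the exception and append its message to the on-screen error string. A minimal sketch of the same pattern, where `Etc` and `load_mode` are stand-ins for illustration, not the actual EYESY internals:

```python
class Etc(object):
    """Minimal stand-in for the EYESY `etc` object."""
    def __init__(self):
        self.mode = "U - Webcam"
        self.error = ""

def load_mode(etc):
    # Placeholder for the real mode import; raise to simulate a broken mode.
    raise ImportError("No module named pygame.camera")

etc = Etc()
try:
    load_mode(etc)
except Exception as e:
    # Appending str(e) tells you *why* the mode failed, not just that it did.
    etc.error = "Mode " + etc.mode + " not loaded, probably has errors. " + str(e)

print(etc.error)
```

With the original bare `except:` you would only ever see the generic "not loaded" message in `/tmp/video.log`; with this change the underlying exception text is visible too.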
I tried `v4l2-ctl --all` and it showed my webcam, but it also said 'not using libv4l2'. Would that be the issue? I put the driver info below if it helps. I also made the changes to the python engine you listed.
can you tell me what webcam you are using? are there specific cameras that are compatible with v4l2?
Driver Info (not using libv4l2):
Driver name : uvcvideo
Card type : Q2n-4K Webcam 720p: Q2n-4K Webc
Bus info : usb-3f980000.usb-1.3
Driver version: 4.14.98
Capabilities : 0x84200001
Video Capture
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x04200001
Video Capture
Streaming
Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
Width/Height : 1280/720
Pixel Format : 'MJPG'
Field : None
Bytes per Line : 0
Size Image : 524288
Colorspace : sRGB
Transfer Function : Default
YCbCr/HSV Encoding: Default
Quantization : Default
Flags :
Crop Capability Video Capture:
Bounds : Left 0, Top 0, Width 1280, Height 720
Default : Left 0, Top 0, Width 1280, Height 720
Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1280, Height 720
Selection: crop_bounds, Left 0, Top 0, Width 1280, Height 720
Streaming Parameters Video Capture:
Capabilities : timeperframe
Frames per second: 10.000 (10/1)
Read buffers : 0
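The hex "Capabilities" value in the `v4l2-ctl` output above is a bitmask. Decoding `0x84200001` with the flag values from the kernel's `videodev2.h` reproduces the human-readable list that `v4l2-ctl` prints (only the four flags relevant to this listing are included here):

```python
# Capability bits from the V4L2 spec (include/uapi/linux/videodev2.h).
V4L2_CAPS = {
    0x00000001: "Video Capture",          # V4L2_CAP_VIDEO_CAPTURE
    0x00200000: "Extended Pix Format",    # V4L2_CAP_EXT_PIX_FORMAT
    0x04000000: "Streaming",              # V4L2_CAP_STREAMING
    0x80000000: "Device Capabilities",    # V4L2_CAP_DEVICE_CAPS
}

def decode_caps(mask):
    """Return the names of all capability bits set in `mask`."""
    return [name for bit, name in sorted(V4L2_CAPS.items()) if mask & bit]

print(decode_caps(0x84200001))
# ['Video Capture', 'Extended Pix Format', 'Streaming', 'Device Capabilities']
print(decode_caps(0x04200001))
# ['Video Capture', 'Extended Pix Format', 'Streaming']
```

The important bit for this mode is `Video Capture` with `Streaming`: without those, `pygame.camera` cannot grab frames at all.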
Webcam not working here either.
Could this be an issue related to the hardware differences between an Eyesy and the Organelle M?
hmm, I created a little test script https://gist.github.com/velolala/328a31f2b4b27684dcce848cd494b7ee
If you run it on the device while you stopped the main EYESY program (i.e. after `Stop Video` in the web interface), it should print something like this:
music@eyesy:/tmp $ wget https://gist.githubusercontent.com/velolala/328a31f2b4b27684dcce848cd494b7ee/raw/d6ef4c3b5e865d3b926eb88d9c2b00ccc4ffc7d1/test_webcam.py
music@eyesy:/tmp $ python test_webcam.py
camera is '/dev/video1' resolution '(160, 120)'
camera stop
@Syntheist that’s too bad. I actually don’t know the hardware differences between the two, but, since I can use an unmodified Eyesy_OS image on the Organelle, I would think they are very similar.
Please let me know any errors you get through the test script!
And anyone with a working setup please let me know, too :) I’ll try my best to make this work in most cases.
thanks for the script. this is the error I’m getting:
Traceback (most recent call last):
  File "test_webcam.py", line 10, in <module>
    camera.start()
SystemError: ioctl(VIDIOC_S_FMT) failure: no supported formats
so I guess my camera isn’t compatible. is there another way?
Looks like the main issue stems from `v4l2` (as used by `pygame.camera`) expecting the YUYV pixel format. All the cameras I have are MJPG-only.
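One way to check pixel-format compatibility before committing a camera to the EYESY's single USB port: run `v4l2-ctl --list-formats` on any Linux box and look for `YUYV`. The sketch below parses a canned sample of that output so it runs without a camera attached; on a real device you would feed it the output of `subprocess.check_output(["v4l2-ctl", "--list-formats"])`. The exact layout of the sample is an assumption based on typical `v4l2-ctl` output, so treat the regex as a starting point.

```python
import re

# Canned sample of `v4l2-ctl --list-formats` output for an MJPG-only camera.
sample_output = """
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'MJPG' (compressed)
    Name        : Motion-JPEG
"""

def listed_formats(output):
    """Extract the four-character pixel format codes (fourcc) from v4l2-ctl output."""
    return re.findall(r"Pixel Format\s*:\s*'(\w{4})'", output)

formats = listed_formats(sample_output)
print(formats)            # ['MJPG']
print("YUYV" in formats)  # False -> likely incompatible with this mode
```

If `YUYV` is not in the list, the `ioctl(VIDIOC_S_FMT) failure: no supported formats` error above is the expected outcome.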
This effect sounds really cool! I’ve tried using a couple webcams, one pretty old one that didn’t get recognized as a webcam. Then I tried a pretty new one and it was recognized, but no picture came through, just a solid background color. I don’t have a Linux system set up that I can test with currently, unfortunately, so I’m not sure if the webcam is compatible. I was wondering if you could link which webcam you use and have success with?
Just as a follow up, the newer webcam I used that I didn’t have luck with is a ZGear Connect 2K Video Quality Pro HD Resolution Webcam. The webcam box says it uses usb 2.0, and should be plug and play. The mode recognized it as a webcam, since it didn’t show the little lego guy, but no picture came through, just the background color. If you could post the webcam you used successfully, that could be a big help with getting this mode working :-)!
Tried with the KANO Web Cam, a Zoom Q8 in WebCam Mode, and an unbranded Amazon Web Cam… they all did not display the Lego Guy… and they all did not display Video either.
I was really hoping the KANO WebCam would work, as it's low-res (640x480) by default, seems like the cute companion for the Eyesy, and also seems to be made for Raspberry Pi compatibility (https://kano.me/us/store/products/webcam).
Attachment Kano-Webcam.png
So I was able to get the KANO-Webcam mentioned above working by editing line 25 of the main.py to specifically mention the lowest supported resolution and device name…
CAM = ((pygame.camera.list_cameras() or [""])[0], (160, 120)) ## Problem
CAM = ((pygame.camera.list_cameras() or ["USB camera"])[0], (320, 240))
Just in case anybody else happens to have this particular WebCam. If not see the above link to the manufacturer or give Jeff Bezos yer money… https://www.amazon.com/dp/B08KSBSZTG/ref=cm_sw_em_r_mt_dp_VT82F4P2FFR727X22WCF
I was also able to get the Kano webcam working. One slight change on my end to get it to work:
For some reason adding "USB camera" caused it to not display. Leaving it as an empty string with the updated lowest supported resolution resolved it. So for me,
CAM = ((pygame.camera.list_cameras() or ["USB camera"])[0], (320, 240)) ## Problem
CAM = ((pygame.camera.list_cameras() or [""])[0], (320, 240))
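What the edited line 25 actually does, shown with plain lists instead of pygame: `(detected or [fallback])[0]` takes the first camera that `pygame.camera.list_cameras()` detected, and falls back to a hard-coded device name only when detection returns an empty list. That explains why the fallback string matters on some setups and not others.

```python
def pick_camera(detected, fallback=""):
    """Mimic `(pygame.camera.list_cameras() or [fallback])[0]`.

    An empty list is falsy in Python, so `detected or [fallback]`
    evaluates to `[fallback]` only when nothing was detected.
    """
    return (detected or [fallback])[0]

print(pick_camera(["/dev/video1"]))    # detection worked -> '/dev/video1'
print(pick_camera([], "USB camera"))   # nothing detected -> 'USB camera'
print(pick_camera([], ""))             # DudeTheDev's variant -> ''
```

So if `list_cameras()` finds the device, the fallback string is never used at all; if it doesn't, whether `"USB camera"` or `""` works depends on what name the camera backend will accept on that particular unit.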
The camera footage shows up, and the size of the video display shrinks and grows rapidly. To stop that, usually I have to press the trigger to get it to stop resizing itself, then it works fine!
Glad you got the USB Webcam working as well. Are you using the Eyesy hardware as well? I know velolala the Author of the Patch was testing using the Organelle M running on Eyesy 2.1 firmware… I can confirm that I was using an Eyesy on Firmware 2.1 (Imagine the Kano webcam will also work on the Organelle M with the same edits though.)
I think that rapid resizing is an intended trigger effect of the patch/mode as written by velolala. The mode can be edited not to have that effect, I find the re-sizing interesting with persistence turned on and some movement of the knobs automated.
I had the webcam running on the Eyesy for upwards of 4 hours ;) with no issues… so if there’s a doubter out there you can be at rest. Still untested is whether multiple webcam modes can be loaded on the Eyesy… I can easily envision this being an issue for resource availability.
Hopefully there are other working webcams that will be tested, shared, and linked here. The Kano webcam is super cute and ultra cool but currently, not the most economic or long term market availability solution. (kinda wish C&G had an official C&G endorsed webcam to integrate into the Eyesy product environment. Sorta like how the WiFi nub on the Organelle became a basic piece of the OS and development environment… and then C&G sourced and sold a cost effective add on item for the whole club of users.)
just got my Kano cam working too! I am using Eyesy hardware OS 2.1.
I needed to have “USB camera” on line 25, it didn’t work without it.
I might try to tweak the rotation and scaling too; it seems to be only moving through the left 2/3s of the space.
hey so i’ve never coded before. would it be possible for one of you guys to attach a zipped version of the mode with the modified main.py file if its not too much trouble? i’ve tried a bunch of times and can’t get it to work. appreciate any help.
Are you using the Kano webcam? Have you tried making the changes to line 25 like the commenters described above? Those are all the changes that I needed to get mine working, so hopefully you’ll have success following those steps. I’ll post the code that lets mine work, but as described above, different code appears to be needed to get it working on a device by device basis. So, my code may not work for you.
Here's the zip of my webcam file. If it doesn't work, try adding "USB camera" to line 25.
Attachment U-Webcam-1.zip
I’ve tried both DudeTheDev and insektgod’s methods with a fresh flash of 2.2 and only get the lego png. I’m using the Kano webcam and Eyesy as opposed to Organelle.
Hey guys. So I really hope I get the EYESY because it looks beautiful and the price point. But it hinges on if I am able to do this with it, otherwise I have to go with another video synth. So from what I understand I should grab a Kano webcam for best guaranteed results? Do I need an organelle M as well or am I fine with an eyesy as long as I edit line 25 was it? Also.. is there a specific program to edit in? And lastly and most importantly are there any actual example videos of this patch in use? Any help is welcome and appreciated. Thanks.
Does this take the webcams visual and just bounce it around or does it morph with the Audio?
Lapractica, one last thing I did to get it to work: I updated the code, then fully powered down the Eyesy, swapped out the USB WiFi dongle for the camera while the device was powered off, and powered it back on; then it worked. Swapping the USB while it is powered on resulted in the Eyesy freezing up. If it still doesn't work, I fear that there may be minor differences in the hardware Eyesy to Eyesy, and maybe this effect isn't 100% guaranteed to work with all Eyesys and Kano webcams. It's possible that it may work with another kind of webcam, or maybe the effect's code needs changing in another way for it to recognize the webcam.
Miles, I use an eyesy with this effect, and it appears that the Kano webcam has been the most successful, but it’s not guaranteed to work as of yet, judging by the above comments.
The eyesy has 1 usb slot, and comes with a usb WiFi dongle that you use to get onto the eyesy and edit the python code, all from an internet browser, it’s detailed in the eyesy users manual. After you’ve made your code changes, you would power down the eyesy, swap the usb for the Kano webcam, and when you turn it on it should work.
I haven’t seen any videos of the effect on YouTube yet, and I don’t have a good way of recording, personally.
The effect lets you control the rotation of the picture, and lets you add a layer of color in the darkest part of the image, which you can control the sensitivity of. You can choose to let the image scroll towards a corner or stay still in the middle of the screen. On sound triggers (once the volume gets loud enough) the image changes color scales, and after 8 triggers the size of the image gets randomized. It's a pretty neat effect, but it's finicky. It's using the USB slot for an unintended purpose, so unfortunately it's not guaranteed to work. Maybe the difficulties people are experiencing are due to software, hopefully not the hardware, but at this point we can't really tell.
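A rough model of the trigger behavior described above (an assumption based on reading the mode's `get_and_flip` code, stripped of all pygame drawing): the trigger counter wraps modulo 8, and the image scale is re-randomized only on frames where the counter sits at 0. That also explains the rapid resizing some people see at startup: the counter begins at 0, so the scale changes every frame until the first trigger bumps it.

```python
import random

class TriggerModel(object):
    """Pygame-free model of the mode's trigger counter and resize logic."""
    def __init__(self):
        self.triggers = 0
        self.scale = 0.3  # the mode's initial scale factor

    def on_frame(self, audio_trig):
        if audio_trig:
            self.triggers = (self.triggers + 1) % 8
        if self.triggers == 0:
            # Every 8th trigger lands the counter back on 0 -> new random size.
            self.scale = random.randint(12, 45) / 100.0

m = TriggerModel()
for _ in range(7):
    m.on_frame(True)
print(m.triggers)  # 7 -> scale untouched since the first trigger
m.on_frame(True)   # 8th trigger wraps the counter to 0
print(m.triggers)  # 0 -> scale re-randomized this frame
```

Note the randomized scale always lands between 0.12 and 0.45 of full width, which matches the author's comment that full-size frames use too much CPU.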
I haven’t tried running the test script provided by velolala. I’ll give that a try this weekend and post my results if I get any, maybe it can help those having trouble
Thank you so much for the detailed response Dude
My pleasure Miles! I hope we can get these cool effects working on as many Eyesys as possible. Could anyone who has used the support script "test_webcam.py" give some steps on how to use it? Do you need an Organelle to test? I'm wondering how I'm supposed to get the results of the test in my console when the USB dongle is removed to plug in the webcam.
Hey @miles,
> Does this take the webcams visual and just bounce it around or does it morph with the Audio?
In the current patch there is an effect on the EYESY audio triggers. It does not use the microphone of the webcam, if that's what you're after. Also the CPU of the EYESY is a bit too weak to process (scale/rotate) full screen webcam images. I hope to experiment with OpenGL some time soon to see if that can be improved.
@Everyone else: thanks for your active participation! It's great to see my patch hit a nerve 😊 I finally have an actual EYESY (instead of an Organelle M) and will try all the modifications you sent in order to get my patch to work more stably with different webcams.
Is there a way to make the video resolution larger?
can’t get this working with Eyesy os 2.3. I have the kano webcam
using @dudethedev's script on April 19
but now getting this error
"""
ETC Mode that streams a webcam.

Controls:
- Knob 1 and Knob 4 control the colors for masking/keying the picture
- Knob 2 rotates the picture
- Knob 3 controls the movement speed
- Knob 5 controls the background color (default)
- Trigger manipulates the webcam hue and adds some wiggling

If the camera can not be found or not be opened, it will use a static png instead.

Tested: EYESY OS 2.1 on Organelle M

# Copyright notice:
# This mode is dedicated to the public domain: [CC0](https://creativecommons.org/publicdomain/zero/1.0/deed.en)
"""
import pygame
import pygame.camera
import time
import random
import subprocess

pygame.camera.init()
CAM = ((pygame.camera.list_cameras() or ["USB camera"])[0], (320, 240))  # 320,240
BLACK = pygame.Color(0, 0, 0)
X_CENTER = 1280 / 2
Y_CENTER = 720 / 2


def setup(screen, etc):
    etc.capture = Capture(etc)


def draw(screen, etc):
    etc.color_picker_bg(etc.knob5)
    etc.capture.get_and_flip(screen, etc)


class Capture(object):
    def __init__(self, etc):
        self.size = CAM[1]
        self.cam = pygame.camera.Camera(*CAM)
        self.static = None
        self.snapshot = pygame.surface.Surface(self.size, 0)
        self.masked = pygame.surface.Surface(self.size, 0)
        self.out = pygame.surface.Surface((etc.xres, etc.yres), 0)
        self.out.set_colorkey(BLACK)
        # Count triggers
        self.triggers = 0
        self.scale = (etc.xres / self.size[0]) * 0.3  # full size uses too much cpu
        self.threshold = pygame.Color(0, 0, 0, 255)
        self.thr_color = pygame.Color(0, 0, 0, 255)
        # Start camera
        try:
            self.cam.start()
            self.static = None
        except SystemError:
            # camera not available
            self.static = pygame.image.load(etc.mode_root + 'no_camera.png')
            self.static.convert()
            self.triggers += 1

    def get_and_flip(self, screen, etc):
        if self.cam.query_image():
            self.snapshot = self.cam.get_image(self.snapshot)
        elif self.static is not None:
            self.snapshot = self.static.copy()
        self.threshold.hsla = ((etc.knob4 * 360, 50, 50))
        self.thr_color.hsla = (360, etc.knob1 * 100, 50)
        pygame.transform.threshold(
            self.masked,
            self.snapshot,
            self.thr_color,
            self.threshold,
            etc.bg_color,
            2,
        )
        if etc.audio_trig or etc.midi_note_new:
            direction = 1
            self.triggers += 1
            self.triggers %= 8
            if random.random() > .5:
                hue = random.randint(-127, 127)
                if self.static is None:
                    subprocess.Popen(
                        "v4l2-ctl -d {} --set-ctrl=hue={}".format(CAM[0], hue).split()
                    )
        else:
            direction = -1
        if self.triggers == 0:
            self.scale = (etc.xres / self.size[0]) * random.randint(12, 45) / 100.
        rotation = etc.knob2 * 360 * direction
        #self.out.fill(etc.bg_color)
        self.out.blit(
            pygame.transform.rotozoom(
                self.masked,
                rotation,
                self.scale
            ),
            (0, 0)
        )
        factor = -1 if self.triggers < 4 else 1
        _time = int(time.time() * (100 * etc.knob3))  # speed of movement
        X = (etc.xres / 2) - self.size[0] * self.scale * .5 - (_time % (etc.xres / 2)) * factor
        Y = (etc.yres / 2) - self.size[1] * self.scale * .5 - (_time % (etc.yres / 2)) * factor
        screen.blit(self.out, (X, Y))