
Modifying the Birdfeeder app

I don’t have a bird feeder set up yet, so I played around with the version 0.0.84 code this afternoon.

I copied the birdfeeder folder and its contents to a new folder called coco_watch and changed main.py and coco_watch_consts.py so it uses the COCO TensorFlow model found here: https://github.com/charmedlabs/kritter/tree/main/src/kritter/tf/coco
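To give a rough idea of the kind of edit involved, here’s a minimal sketch. Every name and path below is a placeholder of mine, not the actual Vizy/birdfeeder API, so check the real main.py and consts file on your unit:

```python
# Hypothetical sketch of the consts change -- placeholder names and paths,
# not the real birdfeeder identifiers.
import os

# Before (birdfeeder app), the detector loaded the birdfeeder model:
# MODEL_DIR = os.path.join(TF_DIR, "birdfeeder")

# After (coco_watch app), point it at the COCO model from the kritter repo:
TF_DIR = "/usr/local/lib/kritter/tf"        # assumption: model install path
MODEL_DIR = os.path.join(TF_DIR, "coco")    # kritter/tf/coco from GitHub
LABEL_MAP = os.path.join(MODEL_DIR, "labelmap.pbtxt")
```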

After a reboot (not sure I needed one?), the Vizy Apps/examples menu showed my new app and started it for me. It seems to work pretty well and accurately IDs people, chairs, books, cars, cell phones, etc. It does get a few false positives (a palm tree is a zebra?), but in general it seems pretty solid.

Fun!!

Hi Ed,
Cool – our hope is that users will do exactly as you’ve done and put together their own applications using the existing apps/examples as a starting point. And I like the cars parked across the street being labeled as suspicious! You can kinda imagine why it thinks the palm is a zebra. (Did it correctly identify any real zebras in your front yard?)
(Note, Vizy will refresh the apps/examples each time you bring up the dialog, so you don’t need to reboot, but it doesn’t hurt either.)

Edward

I was just playing around with things to see how they work. But I’m thinking it could be a security camera app with some more code. And according to Hofstadter’s law, it shouldn’t take very long. 🙂

And nope - still waiting for the real zebras to show up!

Nice to see that a different TensorFlow model works, but we’ll need to go further than this, of course:

  • Train our own TensorFlow model (which I’ve been trying to do … struggling so far)
  • Build and test our own application
  • Use the Vizy UI to download the new application and run it

I agree.

  • I’m a complete newbie with TensorFlow, so progress on that will be slow for me. I thought substituting a different model was a good first baby step, and it works well enough to encourage me to do some research.

  • It seems pretty easy to build / test a Vizy application – for the Python portions, anyway. My previous work with Python on the Raspberry Pi is helpful.

  • The system that Charmed Labs has put together worked well for me. The UI found the new app and ran it with no issues. I do need to figure out how to modify / program the app UIs.

Hello Guys,
I also found the COCO model pretty useful and flexible – accurate and robust enough for my purposes – so I also loaded COCO instead of the birdfeeder model in the app.
The open question for me now is understanding the app code well enough to define a new list of targets and pests.
I went through the code, but unfortunately without a clear result. Could you please help me modify the app and constants file to match my needs? And could you help me understand the logic that triggers the reaction to a target (“bird”) or a pest?
Thanks
Armando

I’m not an expert, but I believe the COCO model is limited to the objects that it’s been pre-trained to detect / classify. To change which objects are recognized you’d have to train a new model, or find an existing one that meets your needs.

You can implement code similar to the bird feeder app’s to control which COCO-recognized objects are reported.
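Something along these lines might work (a self-contained sketch of the idea, not the actual birdfeeder code; the (name, score) detection format is my assumption, so adapt it to whatever the app really returns):

```python
# Sketch of the filtering idea -- not the actual birdfeeder app code.
# Assumes each detection is a (class_name, score) pair.

TARGETS = {"bird"}        # objects you want to watch for / record
PESTS = {"cat", "dog"}    # objects that should trigger a reaction
MIN_SCORE = 0.5           # ignore low-confidence detections

def classify_detections(detections):
    """Split raw detections into targets and pests; drop everything else."""
    targets, pests = [], []
    for name, score in detections:
        if score < MIN_SCORE:
            continue                      # too uncertain, skip it
        if name in TARGETS:
            targets.append((name, score))
        elif name in PESTS:
            pests.append((name, score))
        # any other COCO class (car, chair, zebra...) is ignored
    return targets, pests

# Example with made-up detections from one frame:
frame = [("bird", 0.91), ("cat", 0.77), ("chair", 0.60), ("dog", 0.31)]
targets, pests = classify_detections(frame)
print("targets:", targets)  # [('bird', 0.91)]
print("pests:", pests)      # [('cat', 0.77)] -- the dog scored too low
```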

Hope this helps.

The COCO detection range is pretty wide, so it’s fine for my purposes – no need for further training. The point is simply how to tell the app which of the detected objects are irrelevant, which one is a target (say, the equivalent of a bird), and which one is a pest (the equivalent of a squirrel, cat, etc.).

The bird feeder app does what I think you’re asking. If you start there, modify it to load COCO, and implement a different constants file, it should work. Look at this file to see what the COCO objects / IDs are: https://github.com/charmedlabs/kritter/blob/main/src/kritter/tf/coco/labelmap.pbtxt

And this file is what I’ve been using: https://www.dropbox.com/s/ywg9dawkzjrwu6p/CocoWatch_consts.py?dl=0
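If it helps, here’s a quick way to list all the class names and IDs in that label map (a rough regex parse; it assumes the standard COCO labelmap.pbtxt layout with id and display_name fields):

```python
# List the classes a COCO model can detect by parsing labelmap.pbtxt.
# Rough regex parse of the pbtxt format; fine for eyeballing the labels.
import re

def read_labelmap(path):
    """Return {id: display_name} parsed from a COCO labelmap.pbtxt."""
    with open(path) as f:
        text = f.read()
    labels = {}
    # Each "item { ... }" block holds an id and a display_name.
    for block in re.findall(r"item\s*\{(.*?)\}", text, re.DOTALL):
        id_match = re.search(r"\bid:\s*(\d+)", block)
        name_match = re.search(r'display_name:\s*"([^"]+)"', block)
        if id_match and name_match:
            labels[int(id_match.group(1))] = name_match.group(1)
    return labels

if __name__ == "__main__":
    for i, name in sorted(read_labelmap("labelmap.pbtxt").items()):
        print(i, name)  # e.g. "1 person", "17 cat", ...
```

The class strings you use in your constants file need to match those display names exactly.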

Thanks Edro!
The label map file was exactly the missing info for me!
Armando
