Bees & Beer UPDATE

We met with Leslie on 3/21 to discuss engineering yeast that produces hops beta acids. Given previous research, this seems very feasible.

As we understand it, humulone is made in a two-step pathway:

isovaleryl-CoA (a byproduct of leucine metabolism) + 3 malonyl-CoA (present in yeast)
→ PIVP (3-methyl-1…)

PIVP + 2 DMAPP (part of the mevalonate pathway)
→ humulone (the bitter acids)

It seems that researchers have already, for unrelated reasons, created yeast that produce PIVP.

Our project will focus on the relationship between bees and yeast. Our ideal Beer of the Future creates a self-sustaining ecosystem that is beneficial to both bees and yeast.

How Bees and Yeast can create a circular economy:

  1. Our synthesized yeast creates hops beta acids that kill varroa mites (a major factor in colony collapse disorder). These acids are extracted to protect the hives.
  2. The yeast feeds off the honey produced by the beehives.
    This fermented liquid, flavored with both the honey and the hops acids inherent in the yeast, is sold for human consumption, providing a steady flow of funding for the colony.
  3. The “used” yeast is fed to the bees as a pollen supplement. This is already a common practice in beekeeping.

March 10 – Begin brewing beer
March 25 – Add honey to beer; begin brewing mead
March 29 – Confirm with Leslie re: yeast; conceptual flow & design [how will this work?]; brew
April 5 – Design; beer is ready!!
April 12 – Build with Chester/Kadallah; design assets
April 19 – Brew
April 26 – Present

How much honey does yeast need?

Magic Beer Windows [final]

In a drastic departure from my original intention, I decided to resurface an old idea and use AR to enhance it: understanding beer. For my programming design final last year, I tried to create a visual system for understanding certain metrics of beer (bitterness, color, category).

Beer is both delicious and surprisingly complicated, as is any fermented product, since the final outcome can only be controlled to a point. Beer is also a luxury item, which means people are often willing to take the time to understand how it tastes and what goes into making it. This matters when designing an AR experience, since people have to go out of their way to download an app and pull out their phones to use it.

While I know using AR to display information about a product, particularly a food-related one, isn't incredibly novel, I think beer is well suited to it: people often take the time to really savor and understand what they are consuming, and there is no room on a beer label for that information. [I'm generally for increased knowledge about whatever it is you are consuming.] Beer can also be more informal (and less complex) than wine, which is why I started here. While the final flavor of beer results from a number of factors, being able to view some of the ingredients and identify categories over time would help create a more discerning palate. It's like learning the language of beer organically: experiential learning.

At first I had wanted to use some of my previous designs as the image targets for beer, but I realized that most beer bottles naturally have their own perfect image targets: labels.

 

I started with Brooklyn Brewery beers since their labels are well contained and they also have detailed documentation on their products. I also relied on Under The Label for more details. Ideally, it would be cool to make an API that this information could be pulled from and to which all breweries would contribute.

Next, I tried to figure out what information to display. Some ideas I played with: history, ingredients, hops, malt, yeast, ecological footprint, location, taste. Ultimately I decided on a mixture of these. This is an area I could still play with, but it ended up taking a back seat to the purely technical issues of development.

Seeing as I wanted to display multiple sets of info, I decided to dive into Vuforia Virtual Buttons, the idea being that obscuring notable parts of the image target can cue an event. Since you would most likely be holding the beer in one hand and the phone in the other, it seemed nice to have a "button" on the physical object itself: you only have to move the fingers of the hand already holding the bottle. I got this to work pretty well with my hand.

Some of the technical issues revolved around displaying multiple 'screens' of info. I was having a button logic issue in the code: I could get the flow of the screens working in one direction, but not the other (i.e., you can toggle from hops to food cleanly, but not the other way around). A sketch of one fix is below.
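For what it's worth, the paging logic itself is language-agnostic. Here is a minimal sketch (in Python, with hypothetical screen names, not the actual Unity script) of the modular-index approach that makes cycling work cleanly in both directions; hard-coded if/else chains tend to break one of the two directions.

```python
# Minimal sketch of symmetric screen cycling. Screen names are hypothetical;
# in Unity, the same logic would live in the virtual button event handlers.
SCREENS = ["label", "hops", "food"]

class ScreenPager:
    def __init__(self, screens):
        self.screens = screens
        self.index = 0

    def next(self):
        # Modular arithmetic wraps around past the last screen.
        self.index = (self.index + 1) % len(self.screens)
        return self.screens[self.index]

    def prev(self):
        # Python's % keeps the result non-negative, so this wraps backwards too.
        self.index = (self.index - 1) % len(self.screens)
        return self.screens[self.index]

pager = ScreenPager(SCREENS)
print(pager.next())  # label -> hops
print(pager.next())  # hops -> food
print(pager.prev())  # food -> hops (the direction that was breaking)
```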


Ultimately, where I got to can be seen below, with the icons serving as virtual buttons. I also added sounds that trigger with each page, just for some more user feedback.
Bottle opening with app:

Food cue:

Hops cue:

In the future, I would definitely take the notes from Rui (below) to make the augmented images more stable and easier to trigger. The buttons also need some work, since they can be a bit fickle. I would also play with some of the content and design.

Notes from Rui: use a cylindrical image target; add autofocus to the app code.

Tango Tango

This week, I worked with Gal to get Tango up and running.

I must say, this week was far more successful, which I think had to do with the fact that Tango comes as a nice package and we had already been through the updating/software struggles.

Gal and I initially wanted to see what dropping movies into an AR setting would look like, but it turns out video playback on mobile devices is not simple or cheap (expensive plugins). We liked the idea of using Depthkit video to place walking, talking people with an apparition-like quality into the environment.

After scrapping the video idea, we turned our attention to animated avatars. Gal had a fully formed avatar of herself from a previous project, so we wanted to see what that image would look like in AR. After rigging her body in Mixamo, we imported the obj/mesh/image files as the body, along with some pre-made dance animations from Mixamo. We followed these guidelines to build the animated character in Unity.

To quickly scan my body, we used the Structure Sensor, downloaded the .obj, uploaded the files to ReMake by Autodesk to re-orient the body and compile it to an .fbx, uploaded that to Mixamo, rigged it, and went from there.

We played with several iterations and liked the effect of these bodies dancing around in a circle, to add some life to a dull day or room. As an homage to one of the earliest internet memes, we added some tunes. 

FINAL PROPOSAL

For my final project, I am interested in augmenting the human form. I find it interesting how bodies become unintentional actors in AR experiences, and how viewing the 'real world' through a screen removes you from it. With my final idea, the main thing I want to address is the relationship between the person/body in front of the screen and the one behind it.

Plan A. I know using a whole body as an object target is kind of tough, but I am really curious to see if I can get Vuforia to recognize parts of the body, like the face or a hand. I like the idea of playing with mental perception by turning a person's hand into a non-human appendage, like a crab's claw, or even "augmenting" the human form: what would it be like to have 10 fingers?

Plan B. A wearable with multiple object targets that change depending on the shape of the body. This could be a kind of game between two people: perhaps the augmentation could be words, and the other person's movement style would form a poem for the person behind the screen. Or a song, if audio were triggered.

I am going to start with plan A and move onto plan B if the technical side is not working out.

 

Biodesign3: Synthetically Speaking

I really enjoyed listening to Dr. Mitchell speak last week, and I left fixated on the idea of an entirely synthetically created organism. Up close, I understand how it works and the interest in pursuing such an endeavor, but from afar, it's crazy! A la the Center for Genomic Gastronomy, I would love to get my hands on some of those early synthetic strains to brew some man-made beer. Even though Dr. Mitchell attests it would taste horrible, I think it's interesting for two reasons: A. being able to taste the difference between nature and man-made constructions, and B. beer is one of the oldest drinks on the planet and its yeast one of our first "domesticated" organisms, so it would be fascinating to reach a point where we have eliminated the need for the naturally occurring species we've relied on for so many years…even if that need was just to get drunk. It speculates on the future of our relationship to the natural world in an era of synthetically created organisms.

Re the Biodesign challenge, if this idea doesn’t qualify, I would love to talk to other people in the class to bounce ideas around and see if I could be part of a team!

 

AR + APIs: Experiments in Failing

 

This week was not good.

I spent about 24 hours just fighting operating systems and software updates only to not get my AR experience onto the iPhone. I was getting a ‘product_name’ error.

At that point, I decided to just modify your code and put my own image target in, which worked.

From there, I rebuilt my AR experience from scratch (despite numerous Unity issues) and tried to modify this code to get some interaction with the weather API. However, my Unity scripts kept running into recognition issues; Unity didn't seem to be recognizing my Game Objects.

All in all, I felt like I fought inane details rather than learning how to actually call APIs in Unity.
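For next time: the request-and-parse flow itself is simple outside of Unity. Here is a minimal sketch (Python, against a hypothetical endpoint with made-up field names, not the class's actual API) of the call the script needs to make; the Unity version would wrap the same steps in a coroutine.

```python
# Minimal sketch of calling a weather API. The URL and JSON keys below are
# hypothetical; the real ones depend on the API you sign up for.
import json
import urllib2

url = "http://api.example.com/weather?city=NYC"  # hypothetical endpoint

# Fetch the raw response and decode the JSON body.
data = json.loads(urllib2.urlopen(url).read())

# Pull out a single field; key names vary by API.
print(data.get("temperature"))
```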

 

Microbiomes

A microbiome full of 'good' bacteria is not a novel concept for me. Much of my undergrad was spent in the microbiology department of a university with a flourishing (pun fully intended) reputation in probiotics research. I remember, after one advanced microbiology class, asking my mom for some obscure probiotics from Japan that my professor had religiously espoused. They were prohibitively expensive, especially considering any benefits of probiotics only manifest after consistent, long-term use, but nonetheless, Santa delivered that year.

I say that as a preface to the idea that science is sometimes really bad at understanding things as a whole, particularly on an intuitive level, since scientists spend so much of their time dissecting the micro and minuscule. Much of biology is replicated in a test tube or in an organism that is dissimilar not only in body but also in environment. I've only come to understand this after taking a step back and better understanding the connection between the physical and mental self. As such, I'm immediately skeptical of any one piece of the puzzle that is titled 'The Next Big Thing,' however interesting it may be up close. At the risk of sounding like Avatar-era James Cameron, our cognitive experience is shaped by the constantly evolving world around and inside of us in a constant feedback loop. It is this extremely complicated and delicate interconnection that I find infinitely fascinating.
On that note, check out this amazing project!

I also found this page, which gives greater insight into the legalities of and myths around GMOs.

BioDesign Ideas/Inspo

* I am interested in buying the "Time Lapse Explorer" kit from the uBiome company and somehow using that data and my own body as an experiment.
* Analyzing microbial clouds around certain meals. What 'clouds' produce better tastes? Could I create "Cloud Bombs"? (re: this article)
* I f*cking love human tear salt. LOVE.
* Cavity-fighting microbe! Old, but worth investigating.
* Skin care
* Amazing resource
  – ethylene-sensing bacteria: interesting, but terrifying??

AR in Unity

This week, Ty and I started working together.

We had grand plans as far as ideas to execute, but realized we were both starting at square 1.3. Since we both had the basics of image-target AR running in Unity, we decided to get an Android phone and try object scanning with the Vuforia app.

As it turns out, scanning is a bit more of a pain than we had envisioned. I think most of these pains could have been smoothed over with a proper setup: proper lighting, a rotating dolly, and a gray background. We did get augmentation in the app's test mode with a sesame oil bottle, which had many differentiated colors on its label.

I took this home to work on and pretty much followed the tutorials by the book, but was having problems. I have a feeling they stem from the poor scan.

Unfortunately, ITP was closed all day Thursday due to weather, so I couldn’t play around with the Android to try to re-scan, so I went back to Image Target recognition. I got that up and running fairly quickly, so moved on to getting the app to run on iOS.

I'll say now that I'm still in the process, but wanted to post this update. I realize now that I need to update macOS in order to update Xcode in order to get Xcode to recognize my iPhone, since the phone is running the latest iOS. I understand most of the steps, but updating my laptop's OS isn't trivial…it's a bit old.

Regardless, the process of sifting through the tedious details and multiple attempts has increased my comfort with Vuforia and Unity.
** Question to ask Rui **: What is the proper organization of folders and scenes in Unity projects? Where should I keep new packages, assets, etc.? What about adding new assets in the middle of a project?

While I continue to work on this (goals: 3D object recognition, iOS operation, advance scripts), here is a little bit of the AR I set up in Unity.

Weeks 1 & 2:

As I read A Life of Its Own by Michael Specter and he introduced the idea of bacterially synthesized drugs to treat malaria, my mind immediately went to the problem of scale. To his credit, he quickly discusses this issue in the article, but it still lurks in my mind with so many biotech and food issues: growing uncontaminated cultures en masse is not easy (or cheap), and urban farming isn't going to get us anywhere. The FarmBot is really fantastic, and I like how it uses a computer interface to design your garden; that's such a smart way to integrate learning using current modes of interaction, particularly in our 'food production dissociated' society. However, I'm stuck on scale. What does this look like in a larger setup? While impressively condensed as is, it's still somewhat impractical for many people (IKEA assembly is a real-world problem) and areas. Are these types of inventions useful as a transition, or too myopic in scope to live on? Where do we start with problems of this scale?

Also, here is the compost machine I was referencing.

Think of the Alternate/Alternative Now – what does it look like? Taste like? Smell like? Feel like?
What questions do you want to see answered or problems you'd like to see addressed?
* Issues of scale
* Will we all be farmers in the future?
* Will we have time to farm in the future?
* What's the deal with drought-resistant plants? Are they useful, beneficial?
* What is the legality of gene construction in agriculture?
* Can a monocrop be modified to actually replenish soil?

What do you wish you could make? Think of some of Sebastian's amazingly whimsical projects.
* A GMO to prevent cavities
* A symbiotic monocrop
* Alternate food packaging
* This: harness the power of microbes to safely dispose of waste or by-products
* Specifically designing a crop capable of withstanding conditions in Madagascar
* Find inspiration HERE

What science/biology stuff do you want to play with and why?
* Genetic manipulation (I have experience, though not with plants)

 

one: scraping Hammacher Schlemmer

Assignment 1: Write a simple web scraper that grabs text from a page or series of pages and saves it to a file.

Since we are using Beautiful Soup to scrape sites (which precludes pages rendered with JavaScript), and since we are just amalgamating the results into a list, I thought retail sites would be the best sources.

While Mortuary Mall was intriguing out of personal interest, I decided to go with Hammacher Schlemmer, because nothing says absurdist capitalism like a company that sells an $800 machine to make "The Perfect Ice Cube." One. Ice. Cube… Going for the most bang for my buck, I scraped "The Unexpected" section.

The code is super simple, and Beautiful Soup worked…beautifully. I had some annoying formatting issues, and I probably dealt with those inefficiently, but I did get the desired output. However, when I went to write the results to a text file, I got the following error. I understand what it's getting at, but I need to ask you how to deal with it in this specific program: 'ascii' codec can't encode character u'\u2019' in position 999: ordinal not in range(128)
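(For reference, a minimal sketch of one common fix: in Python 2, writing a unicode string to a plain file implicitly encodes it as ASCII, and opening the output file with an explicit UTF-8 encoding via io.open avoids that. The URL and the CSS class below are hypothetical stand-ins; the real ones depend on the page markup.)

```python
# -*- coding: utf-8 -*-
# Minimal Python 2 sketch; the URL and the "product-name" class are
# hypothetical stand-ins for whatever the real page markup uses.
import io
import urllib2

from bs4 import BeautifulSoup

url = "http://www.hammacher.com/unexpected"  # hypothetical URL
soup = BeautifulSoup(urllib2.urlopen(url).read(), "html.parser")

# Collect product names from the page.
names = [a.get_text(strip=True)
         for a in soup.find_all("a", class_="product-name")]

# io.open takes an explicit encoding, so characters like u'\u2019'
# (a curly apostrophe) are written as UTF-8 instead of hitting the
# default ASCII codec, which is what raised the UnicodeEncodeError.
with io.open("unexpected.txt", "w", encoding="utf-8") as f:
    for name in names:
        f.write(name + u"\n")
```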

Maybe we can petition for The Productivity Boosting Nap Pod ($16,000.00) at ITP?