Week 10

I spent the beginning of this week continuing to train and test neural networks, the results of which can be seen below; binary map plots were made at thresholds of 0.5 and 0.2, and each network's differences from its immediate predecessor are shown in bold. I spent the latter part of the week writing my final report for the summer, which can be found on the homepage of this website. The report doesn't include the more interesting analysis work I've done that could feed into future lab publications, but it provides an overview of the development of the neural networks.
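
As a concrete illustration of the binary maps mentioned above, the sketch below thresholds a network's probability output at 0.5 and 0.2 and plots the results; `prob_map` here is a random placeholder, not the lab's actual network output.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder probability map; in practice this would be the network's
# per-pixel output, with values in [0, 1].
prob_map = np.random.rand(128, 128)

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, threshold in zip(axes, (0.5, 0.2)):
    binary_map = prob_map >= threshold  # pixels at or above the threshold count as "bee"
    ax.imshow(binary_map, cmap="gray")
    ax.set_title(f"threshold = {threshold}")
plt.tight_layout()
plt.show()
```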


Week 9

I spent the majority of my time this week debugging, doing more analysis on my small swarm, and continuing to try to improve my neural networks using the small swarm as training data (shown below).


Week 8

I started off this week by training neural networks using my labeled small swarm as training data. This process took some time to get going because the sheer amount of data kept crashing my notebook, but I was eventually able to modify the pipeline to handle the increased data size. I trained various networks throughout the rest of the week; their specifications and results are listed below:
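
For context, one common way to keep a notebook from crashing on a large training set is to load crops from disk on demand rather than holding everything in memory at once; the sketch below is a minimal, hypothetical version of that idea using a PyTorch `Dataset` (the file layout and the `CropDataset` class are assumptions, not my actual code).

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class CropDataset(Dataset):
    """Hypothetical dataset that loads one labeled crop from disk at a time."""

    def __init__(self, image_paths, label_paths):
        self.image_paths = image_paths
        self.label_paths = label_paths

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        # Only one crop is read into memory per call, so memory use stays
        # roughly constant no matter how large the dataset grows.
        image = np.load(self.image_paths[idx]).astype(np.float32)
        label = np.load(self.label_paths[idx]).astype(np.float32)
        return torch.from_numpy(image), torch.from_numpy(label)

# Usage sketch:
# loader = DataLoader(CropDataset(image_paths, label_paths), batch_size=4, shuffle=True)
```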


Week 7

I spent the vast majority of this week labeling bees, and I was finally able to finish labeling all of the identifiable bees in the small swarm, ending up with 425 separate bees.


Week 6

One mini research question that came up in discussion with Danielle this week was how much time could be saved in creating training data by using a preliminary neural network's output to assist with labeling. Currently, I am spending hours labeling bees to have more data to train the network on, but I also have a preliminary network that already identifies bees fairly well. It would therefore be possible to feed new data into the network and use its output as a foundation for labeling; instead of starting from scratch, I would only have to clean up the predictions and fill in any gaps. We think this process could greatly increase the rate at which we produce training data, helping the network learn better, faster. Though this wouldn't help with the swarm I'm currently working on, as I've come too far to benefit from the network, I've begun tracking how long the labeling process takes in order to assess how much time could be saved.
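
A rough sketch of that model-assisted labeling idea, assuming the network produces per-voxel bee probabilities (the `model` call, the `prelabel` helper, and the threshold value are all placeholders, not the lab's actual code):

```python
import numpy as np
from scipy import ndimage

def prelabel(model, volume, threshold=0.5):
    """Turn a preliminary model's output into a starting point for manual labeling.

    `model` is assumed to return per-voxel bee probabilities for `volume`;
    the exact invocation stands in for however the lab's network is called.
    """
    prob = model(volume)
    binary = prob >= threshold              # rough bee/background mask
    labels, n_bees = ndimage.label(binary)  # split the mask into candidate bee instances
    return labels, n_bees

# The returned `labels` array could then be opened as a labels layer (e.g., in
# Napari) and cleaned up by hand instead of labeling every bee from scratch.
```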


Week 5

At this point, based solely on test loss, the best-performing model was modification 1 trained on 27 crops (referred to as "modification 1.27" from now on). Though this model has the lowest test loss at 0.148, it is possible that some of the other models actually perform better and are simply picking up bees that I didn't identify in my labels, which would inflate their loss. Therefore, I'm not going to fully commit to any network yet; however, when exploring further possibilities for the network, modification 1.27 will be my default starting point.


Week 4

I started this week with a meeting with Danielle to discuss what I accomplished last week and some interesting next steps I could take. The rest of the week I spent (1) continuing to make changes to the neural network, (2) looking more in-depth into how well the networks were performing, (3) attempting to use my networks on the small swarm, and (4) labeling bees.


Week 3

Throughout this week I continued labeling bees in the larger swarm to use as training data. I also started the week with a one-on-one meeting with Danielle to discuss my goals for this summer in more depth. Then, I spent the majority of my time making modifications to the lab's preliminary neural network. Since the starting architecture was fairly simple, there was plenty of room to change it and test for improvements, so I spent the rest of the week making small modifications and observing how each affected the output. Below I summarize the changes I made and the key observations for each. Note that, at this point, the neural network had only a single cube of bees to train on and no validation data, so the models were highly prone to overfitting. Additionally, all models were trained for 1000 epochs.
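
For reference, the experiments used a simple fixed-length training loop along these lines; this is a minimal sketch assuming a PyTorch model that outputs per-voxel probabilities, and the optimizer and loss function are my assumptions rather than the lab's exact choices.

```python
import torch

def train(model, inputs, targets, epochs=1000, lr=1e-3):
    """Minimal fixed-length training loop of the kind described above."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.BCELoss()  # assumes the model outputs probabilities in [0, 1]

    for epoch in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
        # With a single training cube and no validation data, the training
        # loss is the only signal available, so overfitting is hard to spot.
        if epoch % 100 == 0:
            print(f"epoch {epoch}: train loss {loss.item():.4f}")
```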


Week 2

Throughout this week I worked on labeling bees in a larger swarm to use as training data. Meanwhile, I performed some preliminary analysis using the bees I labeled last week in order to better understand the data…


Week 1

I started off this week by meeting Danielle Chase, a postdoctoral fellow in Dr. Peleg's lab, under whom I will be working directly. She introduced me to the project and went over the basics of what we will be doing this summer. Following that initial conversation, I read two papers ("Collective Mechanical Adaptation of Honeybee Swarms" and "Instance Segmentation of Densely Packed Cells Using a Hybrid Model of U-Net and Mask R-CNN") to familiarize myself with the lab's previous work as well as some of the methodology we hope to use in the near future. I then spent the majority of the week labeling individual bees in 3D images to use as training data (starting off with a cube containing about 50 bees); I also began looking into the basics of topological data analysis and into visualizing data with Napari.
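
Getting a 3D volume and an editable labels layer into Napari takes only a few lines; the sketch below uses random placeholder data in place of the lab's images.

```python
import napari
import numpy as np

# Placeholder 3D volume and empty label volume; in practice these would be
# loaded from the lab's imaging data.
volume = np.random.rand(64, 256, 256)
labels = np.zeros(volume.shape, dtype=int)

viewer = napari.Viewer()
viewer.add_image(volume, name="bees")         # raw 3D image
viewer.add_labels(labels, name="bee labels")  # paintable layer for annotating bees
napari.run()  # start the event loop when running as a script
```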
