I won't have a post up this week, but here are a couple of sneak peeks at my RendAR promo video 🎬 and poster for the virtual Capstone Design Symposium next week at the University of Waterloo.
Post 1: Introducing RendAR
Post 2: The Turntable
Post 3: Capture Rig Lighting
Post 4: Motor Control
Post 5: System Integration & Bluetooth
In my last post I said I'd either talk about camera integration or load cell calibration next. Well, hold onto your hat, because they're both working now. Jumping right in, here's a demo capturing my running shoe. The app UI is shown on the left. (Note: this demo doesn't include image enhancement / background removal.)
I designed the iPhone app user interface (UI) in Figma and then built it in Xcode. The UI guides the user through a sequence of steps:
For those who care about software implementation, the app is built using the standard model-view-controller (MVC) design pattern. In step 1, a BLEManager class is initialized. The BLEManager is passed from view to view to maintain the Bluetooth connection. In step 2, a Product class is initialized. The Product object is passed from one view to the next to populate the data fields, including name, mass, and images. Eventually, the Product object will be sent to the cloud for image processing (enhancement + background removal) and then to the retailer's Shopify store.
Side note: designing the UI was fun! Back in the day, I was a bit of an art kid, so it's nice to remind myself of that. In the spirit of shameless self-promotion, here's some evidence of my old art skills. I won my high school art contest without even entering, thanks to this painting, and you better believe I'm still smug about it. Thanks to my grade 12 art teacher, Mrs. Skol, wherever you are.
The mass measurement and image capture happen in step 5. The iPhone and capture rig talk to each other over Bluetooth to synchronize their actions. This state diagram shows the states (blue), events (green), and conditions (black) that dictate when things happen:
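The capture-rig side of that diagram can be sketched as a small state machine. Here's a minimal plain-C++ illustration; the state and event names (and the photo-count condition) are my own stand-ins, not the actual firmware's:

```cpp
// Hypothetical capture-rig states and events; the real firmware's names differ.
enum class State { Idle, Rotating, WaitingForPhoto, Done };
enum class Event { StartCapture, RotationFinished, PhotoTaken };

// Advance the state machine one step. `photosTaken`/`photosNeeded` play the
// role of the diagram's "conditions": keep rotating until enough photos exist.
State step(State s, Event e, int photosTaken, int photosNeeded) {
    switch (s) {
        case State::Idle:
            if (e == Event::StartCapture) return State::Rotating;
            break;
        case State::Rotating:
            if (e == Event::RotationFinished) return State::WaitingForPhoto;
            break;
        case State::WaitingForPhoto:
            if (e == Event::PhotoTaken)
                return (photosTaken >= photosNeeded) ? State::Done
                                                     : State::Rotating;
            break;
        case State::Done:
            break;  // capture complete; nothing more to do
    }
    return s;  // no transition defined for this event
}
```

Walking through one rotation: a rotation-finished event moves the rig from Rotating to WaitingForPhoto, and a photo-taken event sends it back to Rotating (or to Done once the last photo is in).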
Load Cell Calibration
The load cells were a pain in the butt to get working but I only have myself to blame. The capture rig has four load cells and I tried to calibrate them together. That didn't work very well — mass readings were inaccurate, noisy, and unreliable. I did the logical thing and avoided calibrating the load cells again until right before a demo with my professors. This timing wasn't totally by choice. I physically broke my microcontroller around this time and had to wait a few days for a new one to arrive 😬. Fortunately, once I sucked it up, disassembled the turntable, and calibrated the load cells individually, they worked great. Lesson learned: procrastination works, kids 😉. Just kidding, don't recommend.
I also discovered that one of the load cell amplifiers is a dud. When a load cell supports a mass, it bends ever so slightly. This deflection changes the length of strain gauges within the load cell, which changes the resistance in the circuit, which changes the voltage signal read by the microcontroller. The voltage changes are so small that they must be amplified to be meaningful.
I removed the faulty amplifier and load cell, leaving the capture rig with three functioning load cells. Not ideal in terms of balance, but it works. Here's a demo showing the accuracy of the load cells.
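For the curious, calibrating a load cell individually boils down to a two-point fit: record the raw reading with nothing on the platter (the tare) and the raw reading with a known mass, and the ratio gives you a grams-per-count scale factor. Total mass is then just the sum over the cells, since each one carries part of the load. A rough sketch, with made-up numbers rather than my actual calibration values:

```cpp
struct CellCal {
    double tare;   // raw reading with nothing on the platter
    double scale;  // grams per raw count
};

// Two-point calibration: an empty reading plus a known mass fix the line.
CellCal calibrate(double rawEmpty, double rawKnown, double knownGrams) {
    return {rawEmpty, knownGrams / (rawKnown - rawEmpty)};
}

// Convert one cell's raw reading to grams.
double cellMass(const CellCal& c, double raw) {
    return (raw - c.tare) * c.scale;
}

// Total mass is the sum over the (now three) cells sharing the load.
double totalMass(const CellCal cal[3], const double raw[3]) {
    double total = 0.0;
    for (int i = 0; i < 3; ++i) total += cellMass(cal[i], raw[i]);
    return total;
}
```

Calibrating all four cells at once effectively forces one shared tare and scale onto cells that each have slightly different ones, which is consistent with the noisy, inaccurate readings described above.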
You may have noticed a couple of "Coming soon!" fields in the Capture Summary screen. Those fields are Type and Dimensions. I'm working on measuring product dimensions in the app using ARKit. I'm also taking a machine intelligence course and piggybacking my course project on my FYDP. I'll be training a model to populate the product type field.
Up next: Probably measuring dimensions with ARKit 📏
It's time to put the pieces together! So far I've built separate parts of the RendAR system like the turntable, softbox, and motor control loop. Now I'm putting them together physically and with code.
First, I attached the lid to the turntable housing using a piano hinge and lid stays (things that hold the lid up). I wasn't sure how well the lid stays would work, but they turned out perfectly. They have enough friction to hold the lid vertical while being easy to open and close.
Next, I wired everything together and ran a few tests to check the connections. Everything worked as expected...except for the button. Yes, the button — the simplest part of the whole thing. Eventually I realized I needed to configure the button pin on the Arduino with a pull-up resistor and then it was peachy.
Why is there a button? I asked myself that question when it wasn't working, but there's a good reason for it. When I researched competitor products, I read reviews for a bunch of photography and 3D scanning apps. One complaint users had was that they'd line up their shot, press the shutter button in the app, accidentally move their phone, and have to start over. A physical button (with a pull-up resistor...important detail) prevents this problem. Once your camera is lined up, you don't have to touch it again.
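With a pull-up resistor, the button pin idles HIGH and gets pulled LOW when pressing the button connects it to ground, so "pressed" means reading LOW. Button contacts also bounce for a few milliseconds, so firmware typically waits for several consecutive LOW samples before acting. Here's that logic sketched in plain C++ (the threshold and structure are illustrative, not the actual Arduino code):

```cpp
// With a pull-up, the pin reads 1 (HIGH) when idle and 0 (LOW) when pressed.
constexpr int LOW_LEVEL = 0;

// Report a press only after `threshold` consecutive LOW samples,
// filtering out mechanical contact bounce.
struct DebouncedButton {
    int threshold;
    int lowCount = 0;

    explicit DebouncedButton(int t) : threshold(t) {}

    // Call once per sample of the pin level; returns true exactly once
    // per stable press.
    bool sample(int pinLevel) {
        lowCount = (pinLevel == LOW_LEVEL) ? lowCount + 1 : 0;
        return lowCount == threshold;
    }
};
```

Without the pull-up, an unpressed pin is left floating and reads unpredictable values, which is exactly the "simplest part not working" symptom above.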
Here's how the rig looks! I'm debating whether to paint the turntable disc and lid to match my initial design but that's a future-Laura problem. For now I'm enjoying the Scandinavian aesthetic. The phone mount isn't shown here, but it attaches through the hole in the front.
Bluetooth is a type of wireless communication that lets devices talk to each other across short distances. Data is sent between devices via radio waves oscillating at 2.4 GHz. That's 2.4 billion waves per second! In RendAR, I'm using Bluetooth Low Energy (BLE), which uses less power than classic Bluetooth, to connect the capture rig and iPhone app.
You can think of BLE like the community bulletin board at your local coffee shop. In BLE jargon, the bulletin board is a peripheral device that shares information. Each flyer on the bulletin board is a service and each service has characteristics. For example, the bulletin board (peripheral) might have a concert flyer (service) listing date and ticket price (characteristics). As an observer of the bulletin board, you are a central device. You look at the bulletin board and read the information you care about.
For RendAR, the capture rig is the peripheral device. It has a capture service with state and mass characteristics. The iPhone is the central device and reads data from the capture rig peripheral. For example, the capture rig will update the state characteristic to say the turntable finished rotating. The iPhone reads the state and knows it's safe to take a picture. The iPhone can also update the state characteristic after taking the photo, letting the turntable know it can rotate again. This is how the turntable and camera synchronize.
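That handshake can be modeled without any radio at all: both devices just read and write one shared state value. Here's a toy plain-C++ model of the exchange; the state strings and function names are my invention (the real rig uses ArduinoBLE characteristics, and the real phone uses Core Bluetooth):

```cpp
#include <string>

// Stand-in for the BLE state characteristic that both devices read and write.
struct StateCharacteristic {
    std::string value = "idle";
};

// Rig side: after the motor stops, advertise that a photo can be taken.
void rigFinishedRotating(StateCharacteristic& c) {
    c.value = "ready_for_photo";
}

// Phone side: if the rig is ready, "take" the photo and hand control back
// by writing the state, which tells the rig it can rotate again.
bool phonePoll(StateCharacteristic& c) {
    if (c.value != "ready_for_photo") return false;
    c.value = "photo_taken";
    return true;
}
```

One capture cycle is then: rig writes "ready_for_photo", phone reads it and takes a picture, phone writes "photo_taken", rig reads it and rotates, repeat.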
BLE Communication Test
I made a bare bones iOS app to test Bluetooth communication between the capture rig and iPhone. I'm using the ArduinoBLE library on the capture rig side and the Core Bluetooth framework on the iPhone side. Here's a split screen demo with the app shown on the left. In the demo, I'm pressing the Photo Trigger button, which is a placeholder for the actual camera shutter.
Disclaimer: I don't actually have incredibly light running shoes, I just haven't (successfully) calibrated the load cells yet.
Bluetooth is named after King Harald "Bluetooth" Gormsson, who united Denmark and Norway in 958. He was known for having a discoloured tooth, and now he always will be. Jim Kardach, an engineer at Intel in 1997, was reading a book about King Gormsson and liked the name, since Bluetooth unites devices the way Harald united kingdoms. The Bluetooth logo combines the Nordic runes for H and B in honour of King Harald.
Up next: Either load cell calibration ⚖️ or camera integration 📸...it'll be a surprise for both of us which one works first.