This may be a coincidence, but about a week after I got new motorcycle gloves, my Oxford Heater grips started, like, deteriorating. Check this out:
It has been hot recently, but this is still very odd. I got these Street & Steel V-74 gloves from RevZilla. Maybe it’s the Gel Palm leaking out and reacting with the grips? Who knows. Anyway it’s sticky and gross and annoying. Poor Oxford Heaters, I love those things.
I moved to a new place and it has more than one room. Naturally, I hooked up the stereo in the living room and tested it like my dad taught me: by playing “Money For Nothing” really loudly. It worked. But wait a minute, there’s an upstairs now… how will I get it playing up there? I could always use the wifi network and raspberry pis to beam audio around. Yeah, let’s do that!
One of my first memories is a vision of lying near my dad in the basement in the mid-1980s while he endlessly soldered away at some big project. Later, I spent a lot of my childhood messing around with the product he was assembling: a Hero Jr. robot. This was an educational personal robot, intended to be your “friend, companion, and security guard.” Here he is:
Hero Jr. has sonar, an infrared motion sensor, a light sensor, a sound sensor, a radio-frequency remote, a drive motor, an obstruction sensor, and an RS-232 serial port. His out-of-the-box features included a security guard mode, an alarm clock, poetry, singing, and (my favorite) the ability to explore around the house, often while singing America, Daisy Bell, or Little Miss Muffet.
Hero Jr. was created by the Michigan company HeathKit, famous for designing and selling high-quality electronics kits since 1926. (Sadly, it's mostly gone now. UPDATE: It filed for bankruptcy in 2012 but is still operating today to some degree.) As with most HeathKit products, Hero Jr. came as a kit: you had to mount and solder every component onto the circuit boards and install the motors, speakers, and sensors yourself. It cost around $600.
His sensors are pretty solid, even a few decades later. My sister and I had a game where we’d try to sneak past him but he usually caught us. He also has a Cowboys and Robots game where you shoot him in a darkened room with a flashlight and he tells you if he got you first. Fun times. Here’s his Security Guard feature with hilarious singing when you get caught. (You could literally hook him up to a transmitter that would actually summon the police, but we never did this.)
The possibilities really get exciting with the Hero Jr. Programming Language (HJPL), a simple set of instructions interfacing directly with the 99 CPU registers that you could enter, line by line, on the simple hexadecimal keypad of the robot itself. PCs were just becoming a thing at this time, so my dad never experienced the BASIC option over the serial port. Below is a video of me entering and executing a simple counting sample program from the manual. The manual explains that we’re initializing a constant value of 0 in register 1, speaking the contents of register 1, adding a constant to the contents of register 1, and then gotoing back up a line. Legend has it that my dad used to program it to walk around the house looking for a heat source and then recite a love poem (assuming it had found my mom). Sometimes it accidentally recited the poem to the dog.
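Since I can't type the actual hex opcodes into a blog post, here's a toy Python rendition of what that HJPL counting loop does. The register numbers and the stopping limit are illustrative, not the real program:

```python
# Toy emulation of the HJPL counting sample: registers are just a dict,
# "speak" collects what the robot would say, and the goto is a while loop.
def counting_program(limit=5):
    registers = {1: 0}               # LOAD: put constant 0 in register 1
    spoken = []
    while registers[1] < limit:      # GOTO back up a line until stopped
        spoken.append(registers[1])  # SPEAK the contents of register 1
        registers[1] += 1            # ADD a constant to register 1
    return spoken

print(counting_program())  # counts 0, 1, 2, 3, 4
```

The real thing, of course, announces each number through the Votrax chip instead of returning a list.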
It’d be interesting to try to couple Hero Jr. with a Raspberry Pi and use it as part of a more modern personal assistant. When you get a text, it could hunt you down in your house (heat seeking) and tell you what it said. It’d still be a novelty obviously, but its great voice synthesis (driven by the Votrax SC-01A voice synth chip) really could add some charm to the uniform landscape we have today. That’ll have to be a future project.
Hero Jr. is powered by a 1 MHz Motorola 6808 CPU and can have up to 24 kB of RAM. The CPU board is below the power board, pictured below.
Hero Jr. has an add-on slot where you can hook in different ROMs to give him different capabilities. Here are a few of them:
It occurs to me that it’s been a long time since any one person knew all the details of a modern microprocessor; exponential complexity and miniaturization seem to have left us with little hope of exploring the magic of the computer/phone/camera/GPS/stereo/theater in our pockets. This goofy robot reminds me that it’s always possible to explore curiosities and fiddle around.
Given the relative sophistication of this guy, I’m honestly a bit surprised we don’t have much fancier things today. The Smart Speaker and robo-vacuum things are neat, but I think it could become much more interesting. It’s inspiring to see what was done in the 1980s.
With all the AI voice assistants around today there are lots of interesting applications people are dreaming up. Here’s another one.
You could set your voice assistant on the table and start having a discussion or debate that inevitably involves bringing up facts about news or history or how something works. A lot of times when someone doubts what was said, a phone will come out to do some fast wikipediaing or other searching. If an AI could somehow either know or be triggered to check something, that’d be an interesting new dynamic for the conversation. It could do things like:
Correct misquotes and other slight errors in the discussion, e.g. “Actually, the NOAA temperature data were corrected in 1950 because the volunteer network switched from morning readings to afternoon readings.”
Fill in details about a headline someone read (person: “Didn’t I read a headline about radiation dose in beagles?”, AI: “The recent UC Davis study shows a correlation between dose rate and lifespan.”)
Look up details and say them when they’d help
It’d have to be a really smart AI to know when its utterances would be useful in a dynamic conversation. It could start by just lighting up when it thinks it has something to contribute and people could allow it to chime in, rather than having it chime in only when someone wakes it. Then eventually once it’s smart enough it could chime in on its own. The future is fun.
If you have a digital video recorder (DVR) hooked up to some cameras and you want to access it remotely when something happens, you can set up remote access to review things from wherever. Here’s how to do it.
I got super excited about the prospect of helping with this and knew that with a combination of things I’ve used before it would be really doable. The plan was to have a webserver accept messages from a form and transmit them to a Raspberry Pi (cheap mini-computer), which would then flip pins on a relay to blink the light, like this:
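The Pi side of that plan can be sketched in a few lines. The pin number, timings, and the one-blink-per-word scheme here are all made up for illustration; on a real Pi the `set_pin` callback would be something like `GPIO.output`:

```python
import time

RELAY_PIN = 17  # hypothetical GPIO pin wired to the relay board

def blink_schedule(message, on_s=0.5, off_s=0.5):
    """Turn an incoming message into a list of (state, seconds) steps:
    one blink per word, so longer messages blink longer."""
    steps = []
    for _ in message.split():
        steps += [(1, on_s), (0, off_s)]
    return steps

def run(steps, set_pin=lambda state: None):
    # On the Pi, set_pin would wrap RPi.GPIO's GPIO.output(RELAY_PIN, state)
    for state, seconds in steps:
        set_pin(state)
        time.sleep(seconds)

print(blink_schedule("hello there"))  # two words -> two blinks
```

The webserver end just has to get the form text to the Pi somehow (I've used MQTT for this kind of thing elsewhere); the Pi then feeds the message through `blink_schedule` and `run`.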
After many emails and some ups and downs, everything worked! This really feels like how the internet is supposed to work.
Thanks entirely to the efforts of local climate-related organizations in Seattle, I’ve now spoken at a handful of book stores, breweries, universities, and even Town Hall on climate and energy. Last week, I was honored to be on one such panel at a brewery in Ballard alongside Univ. of Washington oceanographer LuAnne Thompson and Governor Inslee’s senior climate policy advisor, Reed Schuler. My role was to provide background information on the human relationship with energy: what we’ve used in the past, what we’re using today, and what our low-carbon options are moving forward. I touched on progress and challenges with intermittency, hydro, and nuclear. This post summarizes and expands upon these topics.
Energy is a replacement for the labor of human beings
The first part of my talk was easy. I threw up my favorite slides demonstrating how energy improves quality of life by replacing human labor. Between construction, farming, heating, water, laundry, and travel it’s a pretty easy case to make.
So in the continuing saga with my mom’s home-automated furnace, it got extra cold recently and I noticed it wasn’t getting up to temperature in time for her to wake up. I figured I could come up with a formula to compute the time needed to come to temperature and turn on the furnace at a dynamic time in the morning so it’d be just right.
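A minimal version of that calculation, assuming the house warms at a roughly constant rate. The 2.5 °F/hour figure is a placeholder you'd calibrate from your own thermostat logs:

```python
from datetime import datetime, timedelta

WARMING_RATE_F_PER_HR = 2.5  # assumed; measure your own furnace

def furnace_start_time(wake_time, current_temp_f, target_temp_f,
                       rate=WARMING_RATE_F_PER_HR):
    """Return when to fire the furnace so the house hits target at wake_time."""
    deficit = max(target_temp_f - current_temp_f, 0.0)
    warmup = timedelta(hours=deficit / rate)
    return wake_time - warmup

wake = datetime(2018, 1, 15, 6, 30)
start = furnace_start_time(wake, current_temp_f=60.0, target_temp_f=68.0)
print(start)  # 8 F deficit / 2.5 F per hr = 3.2 h earlier -> 03:18
```

A fancier version would fit the rate against outdoor temperature, since the house warms slower on the coldest mornings, which is exactly when you care.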
Deep learning takes advantage of certain graphics processors (GPUs) to be efficient. If you take the course, it’s recommended that you sign up for an Amazon Web Services machine with an appropriate GPU so you can just run the provided setup scripts and be on your way learning deep learning. But you may want to try to get everything set up on your own machine if you happen to have one. I just built a small server and added a modest GPU just for this purpose so I figured I’d give it a whirl. This is how I did it.
You know how some airplanes can get their gas filled up while in the air by tankers (aerial refueling)? And how ships at sea do this too (underway replenishment)? And you know how self-driving cars and trucks are taking over everything soon? Well there’s going to be a need for mobile gas stations on the road.
Think of it! Long-haul trucking will want to go non-stop, and to do so there can be little sections of road where a tanker truck drives alongside the main truck, hooks up a hose, and refuels it for 10 minutes while everyone’s moving. Then the self-driving truck carries on and the tanker truck crosses the road, services a truck going in the other direction, and repeats until it eventually has to fill up from a bigger tank nearby.
Another manifestation is a thing on a long rail that refuels you as you drive alongside it. The hose could be on a sliding coupler that maintains a hermetic seal.
This could happen with passenger vehicles too. Presumably people will hop in their cars at night and expect to wake up in Florida the next day so they’re going to need automated gas refilling as well. Ideally this would be underway but I guess if gas stations could just fill up cars that roll in that’d be acceptable too. It will be more comfortable and less disturbing if this happens while on the road though.
That’ll be a billion dollar industry soon. If they’re electric cars, these will be charging stations instead of refueling stations.
I decided I wanted a network-attached storage (NAS) server because I needed some central and safe place to put all my big files. I’ve been using more and more hard drive space because I’ve been taking photos in RAW and collecting more digital video (camera, dashcam, digitized home videos from the 1990s, and drone). I also just enjoy fiddling with servers and stuff and thought I could use a home server for a variety of other things. My raspberry pi has been doing well for my home automation but a bigger server might make it faster. I’m trying to learn Blender and have been eyeing a Machine Learning course. Both of those require a nice modern GPU. Finally, I just enjoy learning things about computers.
On June 19th, my little sister sent me Annie Dillard’s essay about her experience viewing the 1979 total solar eclipse and stated that we were going to go see it in Oregon. She said: “This essay has made going to the Eclipse non-negotiable in my mind.” I had been moderately interested, but somehow the essay made it sound way cooler than I had previously envisioned, and so I got excited about it. There was already hype about how bad traffic would be down in Oregon, but she said she had been thinking about dispersed camping in Malheur National Forest. I looked at a map and it looked pretty good.
I read that it’s Goodnight Moon’s 70th birthday today. I have it on the bookshelf so I pulled it down to celebrate. Going through it after so many years led me to discover some nice hidden gems worked into the illustrations that I had never noticed before (like when I was 5). I’m sure parents everywhere notice after reading this hundreds of times, but it was fun for me to discover them.
The story takes place from 7pm to 8:10pm
The two clocks in the room are synchronized. They start at 7pm and end at 8:10pm. Each time the room is shown it’s 10 minutes later. I think everyone notices that the moon rises in each scene as well.
I’m trying to learn ways to minimize my reliance upon large companies for handling my day-to-day personal data. So I figured calendar and contacts should be on my list of things to self-host. This post is about how I migrated all my Google calendars and phone contacts to my own server without losing any features I was using. I’m doing this mostly for fun.
I got a few Amcrest Wifi security cameras for my mom’s house at her request. They’re pretty nice overall (my only complaint is that the web interface doesn’t fully support Linux). I set one up to save a jpg snapshot to memory every minute and then flew across the country. When I wanted to access them, I couldn’t just pop the SD card into a computer, and clicking through all 14,000 of them seemed like a pain, so I decided to figure out how to get them with a Python script.
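The two ingredients of a script like that are (1) knowing which files to ask for, and (2) pulling an image over HTTP. Here's a sketch of the first part; the filename pattern is a stand-in for whatever the camera actually writes, and the fetch shown in the comment uses the Dahua-style CGI snapshot endpoint that Amcrest cameras speak (host and credentials are placeholders):

```python
from datetime import datetime, timedelta

def snapshot_names(start, minutes):
    """One filename per minute, matching a one-jpg-per-minute camera setup.
    The naming scheme here is hypothetical; match it to your camera's."""
    return ["snap_%s.jpg" % (start + timedelta(minutes=i)).strftime("%Y%m%d_%H%M")
            for i in range(minutes)]

# Fetching one image looks roughly like this (the camera wants digest auth):
#   import requests
#   from requests.auth import HTTPDigestAuth
#   r = requests.get("http://192.168.1.50/cgi-bin/snapshot.cgi",
#                    auth=HTTPDigestAuth("admin", "password"))
#   open("snap.jpg", "wb").write(r.content)

print(snapshot_names(datetime(2017, 6, 1, 12, 0), 3))
```

Loop the fetch over the generated names (or the camera's file-listing API) and you've replaced 14,000 clicks with one command.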
There are some digital levels on the market that are really nice tools to have for a variety of purposes. I grabbed a DXL360 and am really happy with it so far. When I wanted to do an angle vs. time calibration measurement of my Barn Door Startracker over 10s of minutes, I really wanted to get the data from the level into a computer so I could plot and process it a bit.
The level has a USB port but the manual suggests that an optional attachment is required to get it into a computer, at least for this model. However, the manual also states that data comes out of it in RS232 format. I bet I could read that data with some more generic equipment that I have sitting around. And it turned out to be easy. This post shows how I did it.
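The reading side is standard pyserial territory. I'm assuming a text-ish line format here ("X:+012.34 Y:-001.20" is made up); the parsing regex is the part you'd adjust once you see what your unit actually emits:

```python
import re

def parse_angles(line):
    """Pull signed decimal numbers out of one line from the level.
    Assumes a hypothetical format like 'X:+012.34 Y:-001.20'."""
    return [float(x) for x in re.findall(r"[-+]?\d+\.\d+", line)]

# Reading the stream itself is standard pyserial (port name is a placeholder):
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
#       for raw in port:
#           print(parse_angles(raw.decode("ascii", "ignore")))

print(parse_angles("X:+012.34 Y:-001.20"))  # [12.34, -1.2]
```

Timestamp each parsed reading as it arrives and you have your angle vs. time data ready for plotting.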
I like to mix hobbies, so naturally I’ve been eyeing astrophotography for a while. I’ve taken a time-lapse here and a moon picture there but, inspired by the folks over at /r/astrophotography, I wanted to take it to the next level. Since the Earth is spinning, any long exposure of the night sky has star trails, so you have to make your camera counter-spin if you want clear shots. In this post, you can read about how I made a simple barn door sky tracker to do this.
Barn door sky trackers have been made at home by lots of people for a long time. There are a variety of designs with different levels of complexity and precision. I thought I’d make the simplest-to-construct one, a Haig mount. To correct the tangent error, I decided to use a cheap microcontroller (MCU) and have it speed up appropriately via software. Fun!
The math behind this is fun mostly because it’s straight out of high school and you finally at long last get to use it. Here’s the basic design:
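The key relationship: if the drive rod sits a distance R from the hinge and the door has to open at the sidereal rate ω, a straight rod has advanced d = R·tan(ωt) after time t, so the rod speed must grow like sec²(ωt) instead of staying constant. That's the tangent error the MCU corrects. A quick numeric check (R is just an example value):

```python
import math

SIDEREAL_DAY_S = 86164.1
OMEGA = 2 * math.pi / SIDEREAL_DAY_S  # rad/s, Earth's rotation rate

def rod_speed(t, R_mm=200.0):
    """Required drive-rod speed (mm/s) for a straight-rod barn door:
    d = R*tan(omega*t)  =>  d' = R*omega*sec^2(omega*t)."""
    return R_mm * OMEGA / math.cos(OMEGA * t) ** 2

print(rod_speed(0))                    # base rate, R*omega
print(rod_speed(3600) / rod_speed(0))  # ~7% faster after an hour
```

Dividing that speed by the thread pitch of the drive screw gives the stepper rate the MCU has to command at each moment.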
I use SpamAssassin on my e-mail server to flag spam messages that come to my addresses. It uses a series of checks on each message and determines a Spam Score. If the Score is above a user-defined threshold, it adds a header that says that it is spam. Then dovecot files it away into a spam folder instead of my inbox. It does a pretty good job but requires tuning sometimes. I wanted to see if I could change my threshold from the default (5.0) without getting too many false positives or negatives. To do that, I’d have to collect some stats from my messages.
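Collecting those stats comes down to pulling the score out of each message's X-Spam-Status header and seeing how the scores would land against a candidate threshold. A minimal sketch (the sample message at the bottom is fabricated for the demo):

```python
import re

SCORE_RE = re.compile(r"score=(-?\d+\.?\d*)", re.I)

def spam_score(message_text):
    """Extract the SpamAssassin score from an X-Spam-Status header, if any."""
    for line in message_text.splitlines():
        if line.lower().startswith("x-spam-status:"):
            m = SCORE_RE.search(line)
            if m:
                return float(m.group(1))
    return None

def would_flag(scores, threshold):
    """Which messages a given threshold would mark as spam."""
    return [s >= threshold for s in scores]

msg = "X-Spam-Status: Yes, score=7.2 required=5.0 tests=BAYES_99\n\nbody"
print(spam_score(msg))                   # 7.2
print(would_flag([7.2, 1.3, 4.9], 4.0))  # [True, False, True]
```

From there it's just a loop over the files in the Maildir and a histogram of the scores, split by which folder (inbox vs. spam) each message ended up in.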
A year ago, my friend Laura was wishing that email providers could do some tone filtering and reject messages that are too mean. Since I run my own email server, I thought it might be simple to set something like that up. Turns out it’s not that hard, but it wasn’t exactly trivial to figure out.
I have postfix running to receive messages from the internet. It passes them through SpamAssassin, which inspects the messages and adds a few headers that indicate whether or not each one is spam. Then it passes them on to dovecot, which stores the messages in mailboxes and tells my email client, Thunderbird, that I’ve got mail. I like this setup because I feel like I have a bit more control over my data. Besides, it’s fun!
The original request is shown below. I figure that if I could have the message go through a second filter after it goes through spamassassin, I could make that filter a custom script that counts swear words.
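A stripped-down version of such a filter might look like this. It assumes a postfix-style pipe (message in on stdin, annotated message back out for re-injection), a single-part plain-text message, and a placeholder word list:

```python
import email
import re

MEAN_WORDS = {"darn", "heck"}  # placeholder list; supply your own

def score_meanness(raw_message):
    """Parse a single-part message, count mean words in the body,
    and stamp the count into an X-Meanness-Score header."""
    msg = email.message_from_string(raw_message)
    body = msg.get_payload()
    words = re.findall(r"[a-z']+", body.lower())
    count = sum(1 for w in words if w in MEAN_WORDS)
    msg["X-Meanness-Score"] = str(count)
    return msg

# As a postfix after-filter, you'd read sys.stdin, run score_meanness,
# and hand the result back to sendmail. Demo with a fabricated message:
raw = "Subject: test\n\nwell darn it all to heck\n"
print(score_meanness(raw)["X-Meanness-Score"])  # 2
```

With the header in place, a sieve rule (or dovecot's equivalent) can file high-scoring messages somewhere out of sight, exactly like the spam folder works today.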
I got one of those RGB LED matrix things for my birthday and wasn’t sure what to do with it. Then I found this awesome library which has Python bindings and can control it nicely even from a Raspberry Pi. Conveniently I had a spare Raspberry Pi 1 B+ sitting around so I hooked it up. After playing around for a while, I got the demos working.
Get live data (e.g. travel times in traffic, weather conditions) directly from an MQTT broker, and use MQTT for command and control. This lets me connect the screen to my home-assistant home automation system.
Assemble various built-in elements (giraffes, animated text, rainbow text, pictures, animated gifs) into scenes that rotate through on the screen, displaying the information in fun and/or useful ways.
There are Temperature and Duration sprites with user-defined high and low values, so they’re red when they’re bad, green when they’re good, and blended anywhere in between.
You can set the scenes to be just random or you can control them through MQTT.
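The red/green blending for those sprites is simple to sketch. This is just the idea, not the library's actual code:

```python
def sprite_color(value, low, high):
    """Map a value onto a green-to-red gradient: green at/below low (good),
    red at/above high (bad), linearly blended in between."""
    frac = (value - low) / float(high - low)
    frac = min(max(frac, 0.0), 1.0)  # clamp outside the defined range
    return (int(255 * frac), int(255 * (1 - frac)), 0)  # (R, G, B)

print(sprite_color(10, low=10, high=40))  # all green
print(sprite_color(40, low=10, high=40))  # all red
print(sprite_color(25, low=10, high=40))  # halfway
```

For a Duration sprite (say, commute minutes) low really is good; for something like temperature you'd pick the low/high endpoints to match whatever "bad" means for that reading.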
It’s intended to be very configurable but since it’s brand new some extra development is needed to make everything perfect. Send in your ideas and requests and code changes!
A relatively complete example configuration file is in the repo. It demonstrates using MQTT, connecting MQTT topics to various sprites, building your own frames of animation by hand, and adding gifs and images from file paths. Note that you have to set an environment variable or two to get the fonts right and whatnot.