dropofahat.zone
URL:
https://dropofahat.zone/
Submission: On June 25 via api from BE — Scanned from DE
Stop window shopping and start window shopping! dropofahat.zone * Me (Twitter)

I AM USING AI TO AUTOMATICALLY DROP HATS OUTSIDE MY WINDOW ONTO NEW YORKERS

Book the midtown DropofaHat Zone now!

I am a simple midwesterner living in the middle of New York City. I put my shoes on one at a time, I apologize when I bump into people on the street, and I use AI inference (https://universe.roboflow.com/test-y7opj/drop-of-a-a-ha) to drop hats on heads when they stand outside my apartment. Like anybody else. I use it myself.

I have extremely high foot traffic outside my window. I see a sea of uncovered heads in the sun. I believe DropofaHat.zone will become the first of many window-based stores. Here a busy New Yorker can book a 5-minute time slot, pay for a hat, stand in a spot under my window for 3 seconds, have a hat put on their head, and get on with their extremely important, extremely busy day, all within a single New York minute.

HOW TO USE AI TO DO DROPSHIPPING:

My dream is for all the city windows to be constantly dropping things on us all the time. You will need a Raspberry Pi, an Adafruit stepper motor for the mechanism, some yarn, Roboflow for the AI, and a very low-weight but very cool product (like propeller hats).

1. Just Opening the Window
2. The Hat
3. The Dropping Mechanism
4. The AI
5. The Grand Vision

JUST OPENING THE WINDOW

This was a challenge. My window only opens about 4 inches. If I couldn't figure this out, my entire business had no chance. There must have been some kind of key or screw I had to take off to let it open, but I saw no sign of anything other than some very tiny slots on the bottom. If I could just look up what kind of window I had, I figured I could find out what kind of lock goes with it. This turned out to be pretty confusing. I think I have a double-pane awning window? Maybe? Every type of window can look like many other types.
I finally resorted to just googling "window keys" and going through all the images of ones that looked like they could somehow fit my window. Most looked like they needed a lock until I came across this weird shape. I was fully expecting to buy a dozen other keys, but this one actually worked!

THE HAT

Next was deciding what hats I was going to drop and sell. My window is pretty high, so it needed to be a hat that wouldn't hurt someone or fly into traffic. I decided I needed something to signify the future. Something that would look beautiful as it gracefully fell out of a window onto your head. Propeller hats! And this one has a stylish eagle to represent the flying it is about to do.

THE DROPPING MECHANISM

This was the simplest thing to get working. I had a Raspberry Pi and a stepper motor lying around, so I decided to put them to work. After imagining some extra-sharp blades on a tiny motor somehow cutting yarn, I realized I could just wrap the yarn around the stepper motor and have it move slightly. I had a giant camera gimbal to test on and was fully prepared to stick it out the window when I realized the string could just hang over the window with this method. I literally copied this out of the Adafruit tutorial for the stepper motor. This is a single Python file on the Raspberry Pi that the computer will run when the AI determines someone is standing in the right spot and ready to receive their hat.
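Before wiring up the real yarn release, it can help to sanity-check how far the shaft actually turns per loop pass. The sketch below is not from the original post: `iterations_for_degrees` is a hypothetical helper, and `STEPS_PER_REV` assumes a geared 28BYJ-48-style motor with 2048 full steps per output-shaft revolution (check your own motor's datasheet).

```python
# Sketch: map a desired rotation angle to loop iterations of the drop script.
# STEPS_PER_REV assumes a geared 28BYJ-48-style motor (2048 full steps/rev);
# iterations_for_degrees is a hypothetical helper, not from the original post.
STEPS_PER_ITERATION = 4   # the drop script emits 4 coil patterns per loop pass
STEPS_PER_REV = 2048      # full steps per output-shaft revolution (assumption)

def iterations_for_degrees(degrees):
    """How many loop iterations approximate the given shaft rotation."""
    return round(degrees / 360 * STEPS_PER_REV / STEPS_PER_ITERATION)

print(iterations_for_degrees(360))  # 512 iterations = one full turn
print(iterations_for_degrees(45))   # 64 iterations = a small nudge to let the yarn slip
```

A small angle like 45 degrees is usually enough for the yarn to slip off, so you may not need a full rotation at all.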
On the Raspberry Pi as "dropHat.py":

import time
import board
import digitalio

enable_pin = digitalio.DigitalInOut(board.D18)
coil_A_1_pin = digitalio.DigitalInOut(board.D4)
coil_A_2_pin = digitalio.DigitalInOut(board.D17)
coil_B_1_pin = digitalio.DigitalInOut(board.D23)
coil_B_2_pin = digitalio.DigitalInOut(board.D24)

enable_pin.direction = digitalio.Direction.OUTPUT
coil_A_1_pin.direction = digitalio.Direction.OUTPUT
coil_A_2_pin.direction = digitalio.Direction.OUTPUT
coil_B_1_pin.direction = digitalio.Direction.OUTPUT
coil_B_2_pin.direction = digitalio.Direction.OUTPUT

enable_pin.value = True

def setStep(w1, w2, w3, w4):
    # Energize the two coils with the given polarity pattern
    coil_A_1_pin.value = w1
    coil_A_2_pin.value = w2
    coil_B_1_pin.value = w3
    coil_B_2_pin.value = w4

def forward(delay, steps):
    # Walk through the full-step coil sequence `steps` times,
    # pausing `delay` seconds between each coil change
    for _ in range(steps):
        setStep(1, 0, 1, 0)
        time.sleep(delay)
        setStep(0, 1, 1, 0)
        time.sleep(delay)
        setStep(0, 1, 0, 1)
        time.sleep(delay)
        setStep(1, 0, 0, 1)
        time.sleep(delay)

# Run a full rotation (512 iterations) with a 5 millisecond delay per step
forward(5 / 1000, 512)

THE AI

I figured the AI would be the hardest part, but it was surprisingly quick. I put a webcam out over the window and wanted to have inference run live on the video. I wanted the AI to literally show me what it was seeing. This would allow me to potentially live stream the feed later. And also it just seemed really cool to watch. This is what the full webcam stream looks like.

I selected object detection for the initial model and then recorded a couple minutes of pedestrians walking under my window with their hatless heads. Then it was time to annotate images. I only wanted the model to tell me when someone was on the exact sidewalk square directly under my window. I put in the class and prompt "person" and a lot of the annotating was done for me automatically. For the rest, I dragged a box around the person if they were on the right part of the sidewalk, or marked the image as null if not.
You can view the annotated images here: https://universe.roboflow.com/test-y7opj/drop-of-a-a-hat/browse

Obviously, your window view will look different from mine, so you will have to record a few minutes and upload your own. The model seemed to work well with only 133 images annotated. I made sure there was a mix of positive and null ones. Since I only cared about a really small section of the sidewalk, I added a preprocessing step of cropping the image. While this worked, I realized I would bump the webcam every now and then, so I wanted a more generalized model. I removed the cropping and it worked oddly well. It only "detected" a person when they were in the spot I wanted. Even though I only annotated one person, it already worked on whoever walked by. Finally, I had a working model. You can view mine here: https://universe.roboflow.com/test-y7opj/drop-of-a-a-hat/model/2

Now we have to run a Python program on the computer running your webcam. I want this code to do 2 things:

1. Confirm someone is standing in the correct spot for 3 seconds straight
2. Call the Raspberry Pi after the 3 seconds have passed

On your computer with the webcam, pip install the inference library and the SSH libraries I used:

pip3 install inference-sdk
pip3 install opencv-python
pip3 install paramiko

This is the entire Python file. You will have to put in your own API key. I want mine to stay in the free tier. It works by calling the model every second, and if it confirms someone is in that spot 3 seconds in a row, it will SSH into my Raspberry Pi and run dropHat.py.
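The "3 seconds in a row" rule in step 1 is easy to get subtly wrong, so here is that counting logic pulled out as a small pure function you can test on its own before pointing it at a live webcam. This is a sketch under the post's assumption of one inference call per second; `update_counter` is a hypothetical helper name, not from the original code.

```python
# Sketch of the "3 detections in a row" debounce, assuming one inference
# call per second. update_counter is a hypothetical helper for testing.
def update_counter(consec, detected, threshold=3):
    """Return (new_count, should_drop) after one inference result."""
    if not detected:
        return 0, False   # any miss resets the streak
    consec += 1
    if consec >= threshold:
        return 0, True    # drop the hat, then reset for the next customer
    return consec, False

# A miss in the middle resets the streak:
count = 0
for seen in [True, True, False, True, True, True]:
    count, drop = update_counter(count, seen)
print(drop)  # True only after the final, third consecutive detection
```

Resetting the counter to zero after a drop means the same person standing still triggers at most one hat every three seconds, which matches how the main loop behaves.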
import cv2
import time
import paramiko
from inference_sdk import InferenceHTTPClient

CLIENT = InferenceHTTPClient(
    api_url="https://detect.roboflow.com",
    api_key="API_KEY"
)

def ssh_execute(host, port, username, password, command):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.WarningPolicy)
    try:
        client.connect(host, port=port, username=username, password=password)
        stdin, stdout, stderr = client.exec_command(command)
        # Print command output
        print(stdout.read().decode().strip())
        error = stderr.read().decode().strip()
        if error:
            print('Error:', error)
    finally:
        client.close()

video = cv2.VideoCapture(0)  # 0 usually refers to the webcam
consec_detections = 0

while True:
    ret, frame = video.read()
    # Directly pass the frame to the Roboflow model
    result = CLIENT.infer(frame, model_id="drop-of-a-a-hat/2")
    # Check if a prediction was made
    if 'predictions' in result and len(result['predictions']) > 0:
        consec_detections += 1
    else:
        consec_detections = 0
    # If there are three consecutive detections, perform the action
    if consec_detections >= 3:
        # DROP THE HAT
        ssh_execute('raspberry.local', 22, 'pi', 'raspberry', 'python3 dropHat.py')
        # Reset counter
        consec_detections = 0
    # Delay before the next frame for 1 second
    time.sleep(1)

THE GRAND VISION

There is a bigger dream here. Picture a world where you can walk around New York City and everything you need is falling out of windows onto you. At a moment's notice, at the drop of a hat. That's a world I want to live in. That's why I'm teaching you how to do it yourself. Remember this as the first place you heard of "Window Shopping."

dropofahat.zone Copyright 2024