The FABricators

The FABricators are members of Fab Lab Tulsa, a non-university-affiliated Fab Lab. We come from diverse backgrounds as engineers, programmers, makers, and designers.

We are ready to have some fun!




2013 Archive

Wheelchair controller joystick interface

Posted on: February 26, 2013

Deconstructing an electric wheelchair to use as our robot platform worked really well.  It saved us a lot of time because it already solves many of the hard problems associated with building a robust outdoor robot.

In order to interface with the wheelchair’s controller, we replaced the signals generated by the joystick with our own signals generated by an Arduino.

The wheelchair’s joystick has some interesting features to improve safety which make it more difficult to interface with.  For one, each axis of movement has a pair of differential voltage signals.  When idle, each reads 2.5v but when the joystick is moved, the voltage on one signal goes up while the other goes down.  Also, the minimum and maximum voltage for each signal are not 0v to 5v but instead are 1v to 4v.  Both of these features enable the wheelchair controller to detect a broken or shorted wire so that it can produce an error rather than send the rider off in some undesired direction.

Once we understood how the joystick signals behaved, we designed a setup to reproduce them with our own controller.  We used 4 PWM outputs on an Arduino and a custom RC low-pass filter board to implement our interface.  If we had added an inverter to the low-pass filter circuit to generate the differential voltage pairs, we could have used just 2 PWM outputs, but there was no need in our case.
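The mapping from stick deflection to the voltage pair (and the PWM duty cycles that produce it after the RC filter) is simple enough to sketch. Here is a rough Python illustration; the function name, the [-1, 1] deflection convention, and the 8-bit duty resolution are our own assumptions for this sketch (the real interface is Arduino code driving the filter board):

```python
def axis_to_pwm(deflection, vcc=5.0, center=2.5, swing=1.5):
    """Map one joystick axis deflection in [-1.0, 1.0] to the differential
    voltage pair the wheelchair controller expects, plus the 8-bit PWM duty
    values that (after the RC low-pass filter) average out to those voltages.

    At rest both signals sit at 2.5 V; full deflection swings one signal
    to 4.0 V and the other to 1.0 V, so the pair always sums to 5.0 V.
    A broken or shorted wire violates that invariant, which is how the
    controller detects a fault instead of driving off in a wrong direction.
    """
    v_a = center + deflection * swing   # "high" side of the pair
    v_b = center - deflection * swing   # complementary signal
    duty_a = round(v_a / vcc * 255)     # 8-bit PWM duty, as in analogWrite()
    duty_b = round(v_b / vcc * 255)
    return (v_a, v_b), (duty_a, duty_b)
```

For example, full forward deflection produces the 4.0 V / 1.0 V pair, while zero deflection leaves both signals at the 2.5 V idle level.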

Here are the EAGLE files we used for the board.

joystick drive PCB

Code on GitHub

Posted on: February 25, 2013

Our vision code, based on reacTIVision, can be found on GitHub at:


Final video

Posted on: February 25, 2013

Here is our final video submission for our deconstruction project.  We had a lot of fun making it!

Final Video

Street Art Bot documentation

Posted on: February 25, 2013

For the past 48 hours, Fab Lab Tulsa has seen more action than could be captured on camera.  Literally!  With all the people and devices flying in and out, we lost all but 2 cameras for our Hangout live stream.  But that doesn’t mean nothing happened. NO! We have fought and powered through many bugs, glitches, wireless buffers, and sleepless hours.  This is our (mostly) complete documentation of the Street Art Bot.

Problems:

1. You have a message/picture you want the world to see, but billboards and TV/YouTube ads are expensive.

2. You have/use a wheelchair and want to make sidewalk chalk images.

3. You have a very big image you want to print on a street/parking lot/grassy patch.

Solution:

1. Take the Street Art Bot out to your choice of street/parking lot/grassy patch.

<boring picture of street/parking lot/grassy patch>

Boring Ground

2. Set up the camera pole.

<amusingly tall pole and robot>

Pole and robot

3. Activate the python script.

<lots of code>

Lots of code

<Robot ready to drive>

finished printing

4. Drive the bot around.  The picture will be automagically filled out by the bot.

<awesome picture of robot driving around>

driving robot and printing

5. Take friends/people you want to see the picture out for an airplane ride over it (optional).

<freaking awesome picture of final painting as seen from plane>

Aerial photo from plane

6. Profit!

The image/message of choice is loaded by a Python script, where it is split into yellow, cyan, magenta, and black channels.  The computer then takes video from the camera and, using reacTIVision, locates the robot on the ground.  After comparing the robot’s location to the image, it decides whether chalk should be sprayed or not.  By doing this repeatedly while the bot moves around, an image is formed.
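The per-pixel logic can be sketched roughly as follows. This is an illustrative Python fragment, not the actual script (the function names, the 2D-list channel representation, and the 0.5 threshold are assumptions; the real code is in our GitHub repo):

```python
def rgb_to_cmyk(r, g, b):
    """Split one RGB pixel (0-255 per channel) into CMYK fractions (0.0-1.0).
    This is the standard conversion; the script applies it across the image."""
    rf, gf, bf = r / 255.0, g / 255.0, b / 255.0
    k = 1.0 - max(rf, gf, bf)
    if k == 1.0:                      # pure black pixel: only the K channel fires
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - rf - k) / (1.0 - k)
    m = (1.0 - gf - k) / (1.0 - k)
    y = (1.0 - bf - k) / (1.0 - k)
    return c, m, y, k

def should_spray(channel_image, bot_x, bot_y, threshold=0.5):
    """Given one CMYK channel as a 2D list of fractions and the bot's pixel
    position (as reported by reacTIVision's fiducial tracking), decide
    whether to fire the chalk pump for that color at this moment."""
    h, w = len(channel_image), len(channel_image[0])
    if not (0 <= bot_y < h and 0 <= bot_x < w):
        return False                  # bot has wandered off the image area
    return channel_image[bot_y][bot_x] >= threshold
```

Repeating this decision many times per second as the bot drives around is what gradually deposits the image.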

As you can see, this is the best way to put your message out there.

Or, to share your lolcat photos. 🙂

The Team

Posted on: February 25, 2013

Not since Lake Placid has the world seen such an awesome team assembled.  A total of 11 people contributed to our project throughout the weekend, including:


Scott “The procurer” Rainwater


Blixa “TV Star” Morgan


Andrew “The Man with the Plan” Harmon


Patrick “Code Ninja” Forringer


Jeremy “The Executioner” Zongker


Zack “Circuit Master” Hale


Angela “Pyrotechnic” Mareschal


Dana “The Brain” Swift


Dan “AWESOME!!!” Moran


Scott “Eye Role” Phillips


Kendall “Go big or go home” Phillips





In your face Interface

Posted on: February 25, 2013

This is a shot of our super handy debugging interface to our Python code (thanks Jeremy and Patrick!)

Look at all those wandering paths!

Look ma, the slowest dot printer around!

Posted on: February 25, 2013

So this is the dot pattern that we ended up with (about 20 min of printing after the live feed).  The idea is to eventually use a swarm of autonomous robots wandering random paths, each with its own marker. They could be released over a large area, observed by multiple cameras, communicated with wirelessly over XBee or similar, and paint huge areas in many colors!

The first outdoor print

You can see all the aspects of the project (minus ‘the team’).  There is the deconstructed motorized wheelchair, the webcam mounted on the cardboard tube, the laptop running the Python software and reacTIVision (our Python software is on GitHub), an Arduino and a salvaged MOSFET, and a red Solo cup (I drank out of it before we filled it with chalk water).



‘The nozzle’

Posted on: February 25, 2013

Please wait while ‘the nozzle’ calibrates….. the nozzle is calibrating…… please remain still while ‘the nozzle’ continues calibrating…..


So this is our deconstructed, super-scientific chalk-water depositor.  It is a motorized insecticide dispenser (we don’t think it had ever been used… but it might have smelled a little funny).  It is a gear pump driven by a geared-down hobby-type motor.  By connecting this to a MOSFET salvaged from a power supply, we were able to turn it on and off with an Arduino.
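Driving the pump is just a digital on/off, but a short sketch shows the control idea. The minimum on-time debounce below is a hypothetical refinement, and the class and its parameters are our own inventions; the actual build simply toggled the Arduino pin feeding the MOSFET gate:

```python
class PumpGate:
    """Hypothetical sketch of pump on/off control.  Enforces a minimum
    on-time so a noisy frame-by-frame spray decision doesn't chatter the
    gear pump motor; the caller mirrors the returned state to the MOSFET."""

    def __init__(self, min_on_s=0.25):
        self.min_on_s = min_on_s   # shortest allowed spray burst, seconds
        self.on = False
        self.on_since = None

    def update(self, want_spray, now):
        """Feed in the latest spray decision and the current time (seconds);
        returns the state the MOSFET gate pin should be driven to."""
        if want_spray and not self.on:
            self.on, self.on_since = True, now
        elif not want_spray and self.on:
            if now - self.on_since >= self.min_on_s:
                self.on = False     # only release after the minimum burst
        return self.on
```

For example, a one-frame "stop spraying" glitch 0.1 s into a burst would be ignored, while a stop request after 0.3 s turns the pump off.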


The nozzle is centered directly under a reacTIVision fiducial marker that is tracked by the camera.  It is held in place by a repurposed piece from a 3D-printed (MakerBot) cable chain, precisely hot-glued to the marker.

Please wait....... the nozzle is calibrating

FABricators Tentative Schedule

Posted on: February 21, 2013

Thursday all day:

    Prepare the web site (contact me if you want to help with this)
    Decide on live stream locations.
    Spread the word.

Friday, 11am to 4pm:
    Set up live streams.
    Get materials together.
    Spread the word.
    Start the live stream.
    Start hashing out ideas.
    Event Officially starts.
    Tune into main feed for announcements.
    Go home.
Saturday 23rd, 9am:
    Check and update live stream.
    Start working.
    More work.
    Head home.
Sunday 24th, 9am:
    Check and update live stream.
    Start working.
    Start final touches on the project.
    Start video.
    Stop work.
    Finish video.
    Project Presentation over skype.
    We’re done, and hopefully we’ve won.
Monday by 7pm:
    Have the final video and documentation posted online for judging.