Ordi

From Noisebridge
OrdiBooth-Sophistiqué.jpg


Project Ordi

WE DID IT! Ordibooth got a Make Magazine Editor's Choice award! Two years in a row!

This project is a photobooth where a computer takes your picture and then draws it with a robot.

The project is named after and sponsored by a fictional retro-futurist company called Ordi. This is a potentially clumsy play on ordinateur, the French word for computer, which few people outside francophone countries know exists. The computer does most of the "magic" here: it takes the photo and translates it into a half-toned path, which a robot then draws on a piece of paper.

Software pipelines

Networking

The application is a network of computers. The computer that acquires the images is a Raspberry Pi; it should not be assumed to have internet connectivity. The computer that processes the images is a Mac mini (the "minimac"). It should not be assumed to have internet connectivity either, but it does have two network interfaces. The Ethernet interface connects the rpi to the mini, and the mini acts as a router for the rpi. If a wifi network is available, the rpi receives external network connectivity through the mini.

Both endpoints on the wired network have static IP addresses, configured according to Debian conventions in the
/etc/network/interfaces
file. The minimac has iptables rules, saved to
/etc/network/iptables
, that perform SNAT so it can act as a router. In order for a picture captured on the rpi to be available on the minimac for rendering, it must travel over this network.
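As an illustration only, with addresses, interface names, and the subnet all hypothetical (the booth's actual values are not recorded here), the two ends of the wired link and the SNAT rule might look like this:

```
# /etc/network/interfaces on the rpi (10.0.0.0/24 is an assumed subnet)
auto eth0
iface eth0 inet static
    address 10.0.0.2
    netmask 255.255.255.0
    gateway 10.0.0.1

# /etc/network/interfaces on the minimac, wired side
auto eth0
iface eth0 inet static
    address 10.0.0.1
    netmask 255.255.255.0

# /etc/network/iptables on the minimac. MASQUERADE is dynamic SNAT, handy
# when the wifi address comes from DHCP; wlan0 is an assumed interface name.
# IP forwarding must also be enabled (net.ipv4.ip_forward=1).
*nat
-A POSTROUTING -o wlan0 -j MASQUERADE
COMMIT
```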

To solve the file-transport problem with a minimum of custom software, the sshfs FUSE module is used. A directory on the minimac is "exported" to the rpi: in reality, a public/private key relationship is established between the two hosts, and the rpi mounts a directory from the minimac via sshfs.
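A minimal sketch of that mount as a config fragment, with the hostname, user, and both paths being assumptions rather than the booth's actual values:

```
# /etc/fstab entry on the rpi: mount the minimac's picture directory over
# sshfs using the pre-shared key; reconnect options survive network hiccups.
booth@minimac:/srv/ordi/incoming  /mnt/incoming  fuse.sshfs  IdentityFile=/home/pi/.ssh/id_ordi,reconnect,ServerAliveInterval=15,_netdev,allow_other  0  0
```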

The minimac is notified of additions to this directory by a small program that watches the directory for changes.

Optical pipeline

A cheap PowerShot A2200 running CHDK, plus a Raspberry Pi with a touchscreen running a modified version of chdkptp. chdkptp controls the camera, allowing remote shooting from the GUI and saving each picture over the USB cable.

Lee started writing software for a custom touchscreen GUI, which is the primary user interface for the booth. A possible alternative is Kivy, a touchscreen-native framework, but adopting it would mean a from-scratch UI project. That is out of scope for Maker Faire, though probably worth it in the long run, for many reasons.

There is a 3D-printed power adapter for the camera and a hacked battery pack, but each fits a different model of battery. The camera expects 4.3 V and shuts down below 3.3 V. It draws about 0.800 A to power on, then settles at about 0.200 A. Lee bought two DC coupler power units, one a refurbished official Canon part and the other a Chinese replica. Both work well. The DIY adapter did not work reliably, for unknown reasons.

Rendering pipeline

There is a second computer which mounts the internal storage on the camera controller via SSHFS. The picture is opened in darktable for proofing prior to rendering. Darktable >= 2.2.4 is required for realtime updates as new pictures arrive from the camera pipeline.

When a picture is chosen, a darktable plugin opens it in stipplegen, which iteratively builds a stipple cluster and generates a TSP path through the dots. Results of each iteration are shown in realtime on a display on the outside of the booth. After enough iterations a recognizable image forms; use your judgement and wait until you see an image you like. The generator then exports a vector of the TSP path crossing all the dots, which is what the AxiDraw will print.
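To make "a TSP path crossing all the dots" concrete, here is an illustrative nearest-neighbor ordering in Python. stipplegen uses its own, more sophisticated heuristic; this sketch only demonstrates the idea of turning a cloud of stipple points into one continuous pen path.

```python
import math


def tsp_path(points):
    """Order stipple points into a single drawing path (nearest-neighbor).

    Illustrative only: start at the first point, then repeatedly hop to
    the closest unvisited point, yielding one pen path through every dot.
    """
    remaining = list(points)
    path = [remaining.pop(0)]
    while remaining:
        last = path[-1]
        nearest = min(remaining, key=lambda p: math.dist(last, p))
        remaining.remove(nearest)
        path.append(nearest)
    return path
```

Nearest-neighbor gives a tour that is usually far from optimal, but any ordering that visits every dot exactly once is printable; better heuristics just reduce wasted pen travel.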

There is a Processing interface to the AxiDraw serial port. This could be merged into a unified stipple, trace, print application.

The GUI for the rendering section is the most important graphical stage, since it greatly affects the output image. The stipplegen GUI is good for a demonstration but has poor performance; it should be optimized for interactivity.

Printing pipeline

There are currently two drivers for starting a print on the AxiDraw:

  1. An interactive GUI built as an Inkscape plugin. This requires exporting the SVG from stipplegen, importing it into Inkscape, and clicking through the plugin's dialogs.
  2. A Processing utility that controls the AxiDraw directly and includes a simulation mode for previewing the toolpath.
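Under the hood, both drivers end up sending serial commands to the AxiDraw's EiBotBoard. The sketch below shows the general shape of that translation. "SP" (set pen) and "SM" (stepper move) are commands from the EBB command set, but the argument details, the pen up/down servo mapping, and the axis mixing the real machine performs are simplified assumptions here.

```python
def path_to_ebb(path, steps_per_unit=80, ms_per_move=50):
    """Translate an ordered point path into EiBotBoard-style serial commands.

    A sketch of what either driver ultimately produces. Which of SP,0 and
    SP,1 means pen-up depends on the board's servo configuration; here
    SP,1 is treated as pen-up and SP,0 as pen-down.
    """
    cmds = ["SP,1"]              # pen up before travelling to the start
    x, y = 0, 0
    first = True
    for px, py in path:
        dx = int(round((px - x) * steps_per_unit))
        dy = int(round((py - y) * steps_per_unit))
        if dx or dy:
            cmds.append(f"SM,{ms_per_move},{dx},{dy}")
        if first:
            cmds.append("SP,0")  # pen down once the first dot is reached
            first = False
        x, y = px, py
    cmds.append("SP,1")          # pen up when the path is finished
    return cmds
```

Feeding the command list to the board is then just a matter of writing each line to the serial port, which is what the Processing utility does.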

Automation

A few Ansible playbooks should be used to automate building both the camera computer and the rendering computer. This would help support different hardware configurations.
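A minimal playbook sketch for the rendering computer; the host name and package list are assumptions, not the booth's actual inventory:

```
# render.yml: provision the rendering computer (hypothetical host "minimac")
- hosts: minimac
  become: true
  tasks:
    - name: Install the rendering toolchain
      apt:
        name: [darktable, sshfs, inkscape]
        state: present
```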

Sketches

More precise working drawings will follow soon, but here are some of the initial ideas.

The arrangement and proportions of the boîtes (boxes)

APBinitial-2.jpg

And here is the concept for the styling of the photo booth facade

APBinitial-1.jpg
