




C++11 on Microsoft

Background Tasks on RPI

Posted 937 days ago by Dylan Moore ·

Laser projector and glow in the dark paint

More glow and laser info

read more

Posted 1009 days ago by Dylan Moore ·

Emptied Gestures

Charcoal drawings and kinetic performance

read more

Posted 1009 days ago by Dylan Moore ·

Magnetic sensors, next parts order shipping list


Programmable Hall Effect Sensor (~4 bucks)

Hall sensor application notes

Oscilloscope, DIY:
Or, buy one for ~$400 (yuck)

Bonus points: Read up on fluxgate magnetometers and magnetoresistors

“The earliest magnets were naturally occurring iron ore chunks mostly originating in Magnesia hence the name magnes. We now know these materials to be Fe3O4, a form of magnetite. Their unique properties were considered to be supernatural. Compasses based on these magnes were called lodestones after the lodestar or guidestar. They were highly prized by the early sailing captains.”

Posted 1009 days ago by Dylan Moore ·

Time off to-do list

My list of to-dos, with three to five being randomly selected and given to me every day. Repeats allowed, limit three times. Probably weighted based on effort.
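
A rough sketch of how the daily draw could work: weighted random picks, three to five a day, with any one item capped at three appearances. The item names and weights below are placeholders, not my actual list.

#include <iostream>
#include <random>
#include <string>
#include <vector>

struct Todo { std::string name; double weight; int timesDrawn; };

int main() {
    // Placeholder items and effort weights -- not the real list.
    std::vector<Todo> pool = { {"sketching", 1.0, 0}, {"soldering", 0.5, 0}, {"reading", 2.0, 0} };
    std::mt19937 rng(std::random_device{}());

    std::uniform_int_distribution<int> howMany(3, 5);   // three to five per day
    int picks = howMany(rng);

    for (int i = 0; i < picks; ++i) {
        std::vector<double> weights;
        for (const auto& t : pool)                      // an item drawn 3 times drops out
            weights.push_back(t.timesDrawn < 3 ? t.weight : 0.0);
        std::discrete_distribution<int> pick(weights.begin(), weights.end());
        Todo& chosen = pool[pick(rng)];
        ++chosen.timesDrawn;
        std::cout << chosen.name << "\n";
    }
}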

read more

Posted 1050 days ago by Dylan Moore ·

display tech scratchboard + pi

FP2800A - flipdot driver


port forwarding can go die in a fire.

Posted 1059 days ago by Dylan Moore ·

Mold Tracker: Notes

Temperature & Humidity Sensor

About this project

I think I may have some leaks in my apartment walls. I occasionally hear drips splashing down. I’d like to insert a sensor into my walls and measure how much humidity is kicking around, since mold needs water to grow. Appreciable levels of humidity will be a good indicator that something is wrong.

Since most of the Arduino community relies on website resources, I’m going to document my Sunday afternoon project. I’ll be using the RHT03 sensor, available on SparkFun. I’ll be reporting data back to a MySQL database, and then using Processing to graph the data over time.

Background Information

Sensor Description
Arduino Library

Pinout looking directly at the part:
(1) (2) (3) (4)
Vdd Data Null Gnd

NB: Vdd and Vcc both mean the positive supply voltage. The naming convention comes from the type of IC or part it’s powering; Vcc is typically used for BJT-based parts, and Vdd for FETs. What does this mean? It’s the + pin for your power supply, and GND is the -.

Pin 1 (Vdd): Supply the part with 3.3v -> 6v DC. It’s okay to connect it to the 5v pin on your Arduino.

Pin 2 (Data): We’ll be using a digital pin on the Arduino that we can access with interrupts: pin 7. Connect a 4.7kohm resistor between Vdd and the data pin. This is a pullup resistor and helps eliminate spurious ‘floating input’ data.

Pin 3 (Null): This pin gets no connection. Just leave it be.

Pin 4 (Gnd): Connect up to your Arduino’s ground pin.


1. Getting data with the Arduino
2. Sending data to the computer
3. MySQL database queries
4. Graphing the data with Processing
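
To get a head start on step 1, here’s a minimal read-loop sketch. It assumes the widely used DHT Arduino library (the RHT03 is the same part as the DHT22/AM2302), so adapt it if the library linked above differs. The data pin matches the wiring notes: pin 7.

// Minimal RHT03 read loop, assuming the common DHT Arduino library.
#include "DHT.h"

#define DHTPIN  7        // data pin, with the 4.7k pullup to Vdd
#define DHTTYPE DHT22    // RHT03 == DHT22/AM2302

DHT dht(DHTPIN, DHTTYPE);

void setup() {
  Serial.begin(9600);
  dht.begin();
}

void loop() {
  delay(2000);                      // the part only updates about every 2 seconds
  float h = dht.readHumidity();     // relative humidity, %
  float t = dht.readTemperature();  // degrees C
  if (isnan(h) || isnan(t)) {
    Serial.println("read failed");
    return;
  }
  Serial.print(h);                  // one "humidity,temperature" line per reading,
  Serial.print(",");                // easy to parse on the computer side
  Serial.println(t);
}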

Posted 1738 days ago by Dylan Moore ·

Core Audio & Drivers

Device Drivers

Audio Units


Package ddf.minim.effects

and (for audio units)
Thoughts on this later.

Posted 1875 days ago by Dylan Moore ·

Scraping for mohawk weather

curl -s --user-agent "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-us) AppleWebKit/533.20.25 (KHTML, like Gecko) Version/5.0.4 Safari/533.20.27" | egrep '(hbhDateHeader)|(hbhTDCondition")|(hbhTDTime)|(hbhTDHumidity)|(hbhTDWind)' | sed 's/<[^>]*>//g'

This will return the day, hour, temperature, and humidity—the things you need to know if you want to rock an 11” hawk out and about. Currently hardcoded for Vancouver, for when I hit up SIGGRAPH and won’t have a lot of international bandwidth to check these things.

Posted 1917 days ago by Dylan Moore ·

Good resource

Posted 2084 days ago by Dylan Moore ·

Radio Links

Jan 7 2011
AM Radio Receivers

Crystal Radio Circuits
I’m diggin’ the two-transistor design.

Jan 11 2011
NOAA Weather Radio

NOAA Weather Radio All Hazards transmitters broadcast on one of seven VHF frequencies from 162.400 MHz to 162.550 MHz. The broadcasts cannot be heard on a simple AM/FM radio receiver.

Los Angeles: 162.550 MHz

Posted 2118 days ago by Dylan Moore ·


So I decided to combine my geeky interests for a couple of hours tonight— I built a simple oximeter while watching election results. While the democrats really sucked at keeping the house, the good news is that the blood-oxygen sensor works. I can drop the reported numbers by holding my breath. It’s not calibrated to anything in particular.

If you want the sketches or schematics, drop me a line.

Processing sketch showing unfiltered results

Arduino project with light sensor, red LED, and infrared LED. First picture is taken with a normal camera, second with an infrared camera.

Some other research for using cameras!
photo-plethysmographic imaging
And the Wikipedia article

Posted 2183 days ago by Dylan Moore ·

Pulse Oximeter

Biometric sensors that can detect pulse and oxygen content in the blood are called Pulse Oximeters.

Here are two great examples of do-it-yourself pulse oximeters:

Mike Szczys’ hack-a-day diy sensor
And an Arduino based sensor

The DIY sensor uses a red LED and an infrared LED (for baseline light measurements), and it seems to work well. The initial info I was reading mentioned the wavelengths 650 nm and 805 nm.

The secret sauce is in the ratio conversion. From one reference:

After the transmitted red and infrared (IR) signals pass through the measuring site and are received at the photodetector, the R/IR ratio is calculated. The Red/IR is compared to a “look-up” table (made up of empirical formulas) that convert the ratio to an SpO2 value. Most manufacturers have their own look-up tables based on calibration curves derived from healthy subjects at various SpO2 levels. Typically a R/IR ratio of 0.5 equates to approximately 100% SpO2, a ratio of 1.0 to approximately 82% SpO2, while a ratio of 2.0 equates to 0% SpO2
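
To make the arithmetic concrete, here’s a toy mapping that just linearly interpolates between those quoted anchor points. Real oximeters use the manufacturer’s calibrated look-up table, so treat this as illustration only.

// Toy SpO2 estimate from the R/IR ratio, linearly interpolating between the
// anchor points quoted above (0.5 -> ~100%, 1.0 -> ~82%, 2.0 -> 0%).
// Illustration only: not a substitute for a calibrated look-up table.
double spo2FromRatio(double r)
{
    if (r <= 0.5) return 100.0;
    if (r <= 1.0) return 100.0 + (r - 0.5) * (82.0 - 100.0) / (1.0 - 0.5);
    if (r <= 2.0) return  82.0 + (r - 1.0) * ( 0.0 -  82.0) / (2.0 - 1.0);
    return 0.0;
}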

Posted 2184 days ago by Dylan Moore ·

Structured Light scanning from scratch

Structured light scanning on the Mac
I’m putting together notes as I walk through the process of writing a structured light scanner from scratch for the mac. I’m going to put all of the code up here, but no promises until I get it working!

What camera I’m using:
I’m using a Canon EOS Rebel T1i because it’s wicked sweet. And supported ( ;
libgphoto2 has an impressive list of supported cameras, so I figure if I use that, other people can benefit even with different cameras.
Supported Cameras

What projector I’m using:
Just an old off-the-shelf Sony projector. Nothing special, nothing expensive.

Step 1 Get all of the software parts you need to capture images from your camera.
I’m using MacPorts because it makes this kind of thing waaay less painful.
Get it here. After installing, type the following in the terminal:
sudo port install libgphoto2
sudo port install gphoto2

Make sure your environment variables point to the MacPorts sandbox folders.
In the terminal:
cd /etc
sudo pico profile

and edit the PATH variable to include /opt/local, like this:
export PATH=/opt/local/bin:/opt/local/include:/opt/local/sbin:$PATH

That last :$PATH part is important, so you can set other environment variables elsewhere, and this just appends ours.

Close your terminal window and open a new one after setting environment variables, or they won’t take effect.

Step 2 Test that your camera can connect with libgphoto2 by hooking up your camera, turning it on (wait a second) and typing
gphoto2 --auto-detect
This is what I see.

Model                           Port                                            
Canon EOS 500D                 usb:            
Canon EOS 500D                 usb:038,003     

Excellent details on how to use the Command Line Interface (CLI) can be found here.

Step 3 PTPCamera will make you hate life. Kill it with fire.
So, if you actually try to grab an image, you’ll most likely get an error saying that something else is using the camera already. In this case, it’s PTPCamera, a daemon that launches on a macintosh every time you plug in a camera. This is good if you’re using iPhoto, Aperture, or something like that. The best way to tell if PTPCamera is running is to type:
ps -e | grep PTP
If you see TWO lines printed out after this, with some involved camera info, then this is the process you want to kill. I should mention that I ran into a great blog by DC Clark that helped me pinpoint the problem. His solution involves kill -9’ing the process (ending it in the terminal) every time you plug in the camera. Here’s what I’m doing instead:

  • Make a blank, plain text document and put the following in (the #!/bin/sh line is what lets the system run it as a shell script):
    #!/bin/sh
    echo "Remember, you disabled PTPCamera so you could play with gphoto2."
    exit 0

and save it as PTPCamera, with NO FILE EXTENSION. If you have extensions hidden by default on your mac, turn them on and make sure this doesn’t have one.

  • Open up a terminal window, and type:
    sudo chmod +x and then drag the text file into the terminal to add its path. Hit enter. This makes it executable by the system.
  • In the finder, in the “Go” menu, choose “Go to folder” and type
    /System/Library/Image\ Capture/Devices/
    where you should see a lonely executable sitting there called PTPCamera. We want to keep this file around, so we’ll need to give it a new name, like “PTPCamera Backup”. Renaming it is a little weird, since it’s in a special permissions place. You might want to drag a copy to the desktop, rename it there, drag it back in (authenticate), and then delete the originally named file. Then, move your text file into this folder too.

What you should have: The old PTPCamera daemon, renamed to something else, and your executable script named PTPCamera.

What does this all do? When you plug in a camera, it’ll run the script instead, print a message to the system console reminding you that you were mucking with this, and then exit with a return value of zero. This will make the system think the daemon is happily running, while in reality, it never gets launched.

How can I undo this? Get rid of your script, and rename the PTPCamera Backup back to “PTPCamera”. Good as new.

Step 4 Setting up an Xcode project
I’ll eventually be setting up an Objective C project, but for now, I’m going to work with a simple standard C tool.

  • Start by creating a new project with the “Standard Tool” template.
  • Right click on the project icon and choose “Add>Existing Frameworks”.
  • Navigate to /opt/local/bin/ and select gphoto2, gphoto2-config and gphoto2-port-config. Add those to your project.
  • Also navigate to /opt/local/lib/ and select libgphoto2_port.0.8.0.dylib, libgphoto2_port.0.dylib, libgphoto2_port.dylib, and… well, any other libgphoto dylibs and .la’s.
  • Next, right click on the project and choose Get Info. Under the build tab, find “Search Paths”. For “Header Search Path”, write /opt/local/include/. Under “Library Search Path”, write /opt/local/lib/.

Posted 2193 days ago by Dylan Moore ·

Structured Light 3D scanning links, odds n' ends

Control that fancy camera of yours with this:

Structured Light 3D scanning: Instructables

Open Processing 3D scanning

Structured Light reading
…and the new home for that project.

The manual for my canon, which apparently needs a power cord. Psh.

The amazing Douglas Lanman / Gabriel Taubin project Build your own 3D scanner which is really worth a look.

And some others:
Augmented Engineering

Posted 2204 days ago by Dylan Moore ·

Some good references for color spaces and blending

A good reference that explains premultiplied alphas well:
Premultiplied alphas

A good page on Y’CbCr:
and the obligatory wiki page:
and how to use it with quicktime:

Posted 2345 days ago by Dylan Moore ·

Maurice Maeterlinck

At every crossroads on the path that leads to the future, tradition has placed ten thousand men to guard the past.

Posted 2406 days ago by Dylan Moore ·

Current HDRI reading list

High Dynamic Range Imaging: Acquisition, Display, and Image-based Lighting
[Erik Reinhard, Greg Ward, Sumanta Pattanaik, and Paul Debevec]

Color Imaging: Fundamentals and Applications
[Erik Reinhard, Erum Arif Khan, Ahmet Oguz Akyüz, and Garrett Johnson]

The HDRI Handbook
[Christian Bloch]

Posted 2472 days ago by Dylan Moore ·


Posted 2493 days ago by Dylan Moore ·

Open Terminal Here & Automator

For those of you who use terminal regularly, and want a few ways of getting to it quicker, here are some goodies.

1. Open Terminal Here
This is a script that you can drop into the toolbar of your normal finder window, and when clicked, will launch a terminal window that’s been cd’d to that directory. Really cool, if you want to try something different from the usual ‘drag the folder icon into the terminal’ shortcut to get to a specific place.

2. I like hot keys. It’s true. And I wanted a hot key to launch terminal, and I didn’t want third-party stuff choking my OS. Here’s the solution I came up with: Make an Automator project that opens terminal and save it as a Service, which puts it in the app menu; under “Keyboard prefs” in the system prefs, you can now set a hot key for it. I opted for Apple-Shift-T, which mirrors Apple-Shift-A and U for Applications and Utilities in the Finder, but it’ll take away “convert to plain text” in TextEdit. That’ll bring terminal up from any app.

Here’s the automator project for download, though it’s super-basic.

1. Uncompress & drop it into ~/Library/Services/ (Or open it in automator & save.)
2. Add your shortcut in Keyboard Prefs.
3. Rejoice.

Posted 2500 days ago by Dylan Moore ·

Toolchain software: AVR & Terminal

Open Terminal Here
AVR Toolchain Installer
AVR Downloader UploaDEr: AVRDUDE
ATmega basic config
AVR Fuses, thanks LadyAda!
Fuse Calculator
Finally getting around to posting my toolchain links that have been kicking around for three or four years, in various text docs and emails. Enjoy.

PS why are all of the ATMega8, 168, and 328 thin quad flat packs out of stock these days?

Posted 2501 days ago by Dylan Moore ·

Silkscreen Supplies in Santa Monica

Wasserman Silk Screen Co.
1664 12th Street
Santa Monica, CA 90404-3710
(310) 450-6777


And a great link for printing with white ink.

another supplier, Westix

Posted 2511 days ago by Dylan Moore ·

Cross country move, and ready to make art.

Everything since SIGGRAPH has been a blur. I interviewed and was hired onto the ProApps team at Apple as a software engineer a little over a month ago. Since then, I’ve picked up my life and schlepped across the country to California to start life after school. Scary, but amazing.

In other news, I finally got to see the promo piece from IDMAA this year that used part of my interview—I recommend going this year if you have the chance.
[ IDMAA Video ]

Other thoughts-
tilt shift photography
and the armmite pro.
More to follow.

Posted 2556 days ago by Dylan Moore ·

Updates on 100 projects:

1. PrairieDogs: Chris Jensen will be presenting our pDog research at the Ecological Society of America this August. I’m very excited to hear how the work is received; there hasn’t been a whole lot of individual-model simulation software developed for ecologists, so hopefully my software and Chris & Jen’s research with it will make a good impression.
ESA Annual Meeting 2009: Virtual prairie dogs weigh in on the Resource Dispersion Hypothesis

2. Meros has been accepted to the SIGGRAPH 2009 SpaceTime Gallery.

3. I’m working with Melanie Crean on some visualization software for the Shape of Change project. I’ll post links once we have it all online!
Shape of Change

4. Full steam ahead coordinating the Emerging Technologies venue at SIGGRAPH this year. We have fantastic contributors showing off their latest research. If you happen to be in New Orleans from August 3rd-7th, Etech should be on your list of things to check out!
Emerging Technologies

5. The great job hunt continues now that I’ve finished my MFA. Great time for a recession, let me tell you.

Posted 2646 days ago by Dylan Moore ·

More motion capture links

Single camera motion capture with Maya and Boujou

Motion capture without markers

Posted 2690 days ago by Dylan Moore ·

FACE pt 1

An update on a new project that I’m working on with Aaron Cohen called FACE: Facial Animation Capture Engine. FACE is a research project here at the Digital Arts Lab, conceived and directed by Rob O’Neill.

The goal of this project is to make a highly versatile, usable open source motion capture system that can be made at home for less than $350. Right now, we’re using two b&w i-fire cameras (~$160 each) set up in a stereo pair. The software is being developed in Xcode, using OpenCV, GLUT & OpenGL, and cvBlobsLib. Software and plans will be available soon, via the Digital Arts Lab (DAL). For lighting, we’re using two desk lamps on retroreflective spots of tape. It works surprisingly well; as the setup and software matures, I think we’ll have a very inexpensive and reasonable system for students and independent artists to use.

(Please excuse the tron-light-cycle style 3D grid. We needed something to show tracked points, and well, didn’t want to spend more than five minutes on that part. The final visuals will be much, much better!)

What you’re seeing: The two camera rig with desk lamps, and me with a spot of tape on my finger. On the screen: The upper half shows the two video streams, right and left camera respectively. On the lower half, the tracked point in 3D, reconstructed from the corresponding points.
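
For the curious, the reconstruction step boils down to something like this: a linear (DLT) triangulation of one matched point from the two cameras’ 3x4 projection matrices. An illustrative sketch using OpenCV types, not necessarily the exact math in our code.

#include <opencv2/core/core.hpp>

// Triangulate a single matched point from the left/right projection matrices.
cv::Point3d triangulate(const cv::Matx34d& Pl, const cv::Matx34d& Pr,
                        const cv::Point2d& xl, const cv::Point2d& xr)
{
    cv::Matx44d A;
    for (int j = 0; j < 4; ++j) {
        A(0, j) = xl.x * Pl(2, j) - Pl(0, j);   // each image point gives two
        A(1, j) = xl.y * Pl(2, j) - Pl(1, j);   // linear constraints on X
        A(2, j) = xr.x * Pr(2, j) - Pr(0, j);
        A(3, j) = xr.y * Pr(2, j) - Pr(1, j);
    }

    cv::Mat X;                                  // homogeneous solution of A * X = 0
    cv::SVD::solveZ(A, X);

    double w = X.at<double>(3);
    return cv::Point3d(X.at<double>(0) / w, X.at<double>(1) / w, X.at<double>(2) / w);
}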

Posted 2698 days ago by Dylan Moore ·


What my student loans really went to. (PS I need a new bag of peanuts.)

Want video and a few more pics?
Read more.

read more

Posted 2708 days ago by Dylan Moore ·


A long overdue overview of some research I was a part of while at MERL.

Prakash: Lighting-Aware Motion Capture Using
Photosensing Markers and Multiplexed Illumination
R Raskar, H Nii, B de Decker, Y Hashimoto, J Summet, D Moore, Y Zhao, J Westhues, P Dietz, M Inami, S Nayar, J Barnwell, M Noland, P Bekaert, V Branzoi, E Bruns


I totally dig that there’s a picture of me on the wikipedia motion capture entry. w00t.

More info at Ramesh Raskar’s MIT page

Media Coverage:

Posted 2711 days ago by Dylan Moore ·


Accepted for SIGGRAPH 2009 SpaceTime Gallery, August 3rd-7th.
Installed at the Museum of Computer Art, Brooklyn NY, April 2009.
Showed at the Manhattan Center, May 12-16 2009.
Completed for my MFA in Emerging Digital Arts from Pratt Institute, May 2009.
[3:59, 60mb. QUICKTIME. Please click below to begin loading.]

Click to Play!

My full written MFA thesis, including technical details can be viewed [ here ]
(16mb, PDF)

Posted 2721 days ago by Dylan Moore ·

Using OpenGL in a LibDC1394/Xcode project

1. Include OpenGL related files at the top of Main:

//OpenGL related stuff:
#include <OpenGL/gl.h>
#include <GLUT/glut.h>
#include <OpenGL/glu.h>
#include <OpenGL/glext.h>
#include <iostream>   //for the std::cout call in the display callback below

2. Include the OpenGL and GLUT frameworks into your project:

- Right click on the project file list, and choose “Add>Existing frameworks”

- Navigate to /System/Library/Frameworks/ and find “OpenGL.framework”

- Also add “GLUT.framework” in the same way.

3. Create a few callback functions for GLUT:

void display(void)
{   //Drawing commands to go here later...
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutWireCube(1.0);               //unit cube sitting at the origin
    glutSwapBuffers();
    std::cout<<"Hello, World!\n";    //print every frame.
}
void reshape(int w, int h)
{   glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);  glLoadIdentity();
    gluPerspective(40.0, (GLfloat) w/ (GLfloat) h, .2, 1500.0);
    glMatrixMode(GL_MODELVIEW);   glLoadIdentity();
    gluLookAt(5, 5, 5, 0, 0, 0, 0, 1, 0);
}

4. Set up main() to work with GLUT:

int main (int argc, char * argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode (GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutInitWindowSize (640, 480);
    glutCreateWindow ("FACE!");
    glutDisplayFunc (display);   //register the callbacks from step 3
    glutReshapeFunc (reshape);
    glutMainLoop ();             //hands control to GLUT; never returns
    return 0;
}

NB: I’ve changed what main expects as parameters.

Compiling at this point will give you a rendering 640×480 window with a camera set up and a unit cube sitting at the origin.

Next article, I’ll show you how to bring OpenCV into this project and work with the images coming from a pair of cameras.

Posted 2722 days ago by Dylan Moore ·

LibDC1394 and Xcode


1. Grab libdc1394, rev 2 from SourceForge (there’s a link from the project’s homepage).

2. Open terminal and type “cd ” (that’s with a trailing space), then drag in the folder you want to open up—in this case, the libdc1394 folder with the file named “configure”. It will end up looking like this:
DAL-Sputnik:~ hobbes$ cd /Volumes/Olympus/MoCap/libdc1394-2.1.0/
Hit return.
3. type “./configure”
4. type “make”
5. type “sudo make install” and your password when prompted.

6. Open Xcode and create a new project, of type “C++ tool”.
7. Right click on the blue project icon in Xcode and hit “get info”.
8. Navigate to the “build” tab.
9. Find “Header Search Paths” and double click. This should bring up a window where you can add a new path. Clicking on the “+” button lets you type in a new path; add “/usr/local/include”. Hit Ok to close the window.
10. Find “Library Search Paths” next, and add a path like before. This time, add “/usr/local/lib”.
Steps 9 and 10 allow your project to link against the installed libdc1394 library. We have one last step to integrate it into your Xcode project.

11. In the Finder, click on the “Go” menu, and choose “Go To Folder”. Type in “/usr/local/lib”
12. Find a file called “libdc1394.dylib” and drag it into your Xcode project window, on the left side to add it to your list of project files.
13. In the main.cpp file, add this code:

#include <dc1394/dc1394.h>

14. Click “Build and Go” to verify that your project can link against libdc1394. If all goes well, you should see it compile and run, and give you a “Hello, World!” in your console. True, it’s not anything camera related yet, but we’re making sure that everything is correctly connected—which would throw errors at this point, if it wasn’t.
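
If you’d like the build-and-go test to actually touch the library rather than just print the template’s Hello World, here’s a minimal enumeration sketch. It assumes the stock libdc1394 version 2 calls, so double-check it against the headers you installed.

// Minimal link/enumeration check against libdc1394 v2.
#include <cstdio>
#include <dc1394/dc1394.h>

int main()
{
    dc1394_t *d = dc1394_new();                  // library context
    if (!d) return 1;

    dc1394camera_list_t *list = NULL;
    if (dc1394_camera_enumerate(d, &list) != DC1394_SUCCESS) {
        dc1394_free(d);
        return 1;
    }

    std::printf("Found %u firewire camera(s)\n", list->num);
    for (uint32_t i = 0; i < list->num; i++)
        std::printf("  GUID %llx\n", (unsigned long long)list->ids[i].guid);

    dc1394_camera_free_list(list);
    dc1394_free(d);
    return 0;
}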

At this point, you can drag in “grab_color_image.c” from the libdc1394 examples folder, uncheck ‘main.cpp’ and build and go with the example. This will use a firewire camera, so be sure you have one hooked up. If successful, you will see a line in your console saying something like:
[Session started at 2009-05-13 19:01:04 -0400.]
Using camera with GUID 8144361000226ba
wrote: image.ppm (921600 image bytes)
FACE has exited with status 0.

You will find a file called ‘image.ppm’ in your build/debug or build/release folder. Photoshop will know what to do with this file, though it might not look correct. That’s okay, chances are that the program wasn’t set up for your camera yet.

If you’ve made it this far, congratulations! Read through the documentation and explore. Next up will be an article on viewing your images in OpenGL, and processing them in OpenCV.


Posted 2722 days ago by Dylan Moore ·

OpenSource Facial Mocap Clipboard

1. QTkit for brightness/contrast

2. IIDC:


Posted 2724 days ago by Dylan Moore ·

Posting Mouse Events

No idea how fast it is, but I’m going to try it with my touch table tomorrow.
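
For reference, the plain Quartz Event Services way of synthesizing a click looks roughly like the sketch below. No promises that this is what the approach above uses, or that it’s fast enough for the table.

#include <ApplicationServices/ApplicationServices.h>

// Post a synthetic left click at (x, y) in global screen coordinates.
void clickAt(float x, float y)
{
    CGPoint p = CGPointMake(x, y);
    CGEventRef down = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseDown, p, kCGMouseButtonLeft);
    CGEventRef up   = CGEventCreateMouseEvent(NULL, kCGEventLeftMouseUp,   p, kCGMouseButtonLeft);
    CGEventPost(kCGHIDEventTap, down);
    CGEventPost(kCGHIDEventTap, up);
    CFRelease(down);
    CFRelease(up);
}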


read more

Posted 2796 days ago by Dylan Moore ·

Drop me a line ( =

I see a lot of visitors from interesting places all over my webstats. Next time, email me and say hello!
dylanmoore (at] pratt [dot) edu.

Posted 2798 days ago by Dylan Moore ·

It's Alive!

Pictures of my multi touch table, and the obligatory movie of IR blobs. I don’t have the compliant surface on yet, so it’s a little faint. But you can still see the frustrated total internal reflection (FTIR) effect. Hoorah.

(photos in full post)

read more

Posted 2799 days ago by Dylan Moore ·

Mouse control...

The answer is in max/msp’s aka.mouse object…

Masayuki Akamatsu, will you be my hero?

Posted 2800 days ago by Dylan Moore ·

Data Visualization

As I’m developing my own viz tools to see hundreds of thousands of color points for my thesis, I’m going to quickly whip up some tools in Blender (why not give it a try?) and in Python (so quick…)

The biggest problem will be normalizing my high dynamic range (HDR) data into normalized 8 bit-per-channel color ranges (LDR) without getting rid of my initial data.
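
One likely route: keep the float data untouched on disk and only tone-map a copy for display. A minimal per-channel sketch, assuming a simple Reinhard-style curve ahead of the 8-bit quantize:

#include <algorithm>
#include <cmath>
#include <cstdint>

// Map one linear HDR channel value (>= 0) to an 8-bit display value;
// the original float data is never modified.
uint8_t toLDR(float hdr)
{
    float mapped = hdr / (1.0f + hdr);            // compress [0, inf) into [0, 1)
    mapped = std::pow(mapped, 1.0f / 2.2f);       // rough display gamma
    return static_cast<uint8_t>(std::min(255.0f, mapped * 255.0f + 0.5f));
}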

Setting Vertex Colors from Python

Posted 2800 days ago by Dylan Moore ·

Table Progress, Tuesday Night.

I have the majority of my wooden frame built for my multi-touch table. I’m glad I went the route of building it all myself. After adjusting a few parts to increase rigidity, I’ll be ready to install the rest of the IR LEDs and try a test run…!

I’m recording video of all of the construction progress, and I’ll be adding them to a yet-to-be-made youtube account. From there, I’ll post them up here. I’ll eventually be making all of my table plans available to whoever wants to make a table, but doesn’t want to take the time to design it.

(PS… It’s Tuesday. Thanks, isittuesday.com!)

Posted 2807 days ago by Dylan Moore ·


HID Keys project for AVR

USB 2.0 Developer Docs

More AVR-USB Stuff

Posted 2813 days ago by Dylan Moore ·

Is it tuesday?

My favorite thing in the world isn’t just that isittuesday.com exists…
It’s that it has an RSS feed.

Posted 2814 days ago by Dylan Moore ·

Off Topic

This is unrelated to my thesis or research, but it’s one of today’s issues that I feel very strongly about: Equal Rights.

This could be a long post, moving from the 3/5ths compromise, to the Women’s suffrage success in 1920, to the definition and redefinition of marriage. But I’ll spare the fifth-grade civics book report for another time. I’ll simply jump right in and say that I would have thought that our country had come to a general consensus that Jim Crow laws were wrong—or at least in violation of the constitution. We’ve had a dozen or so episodes in our history where people have fought back against oppression, and won. History for which we either celebrate holidays, or at the very least openly acknowledge as progress for all citizens. All citizens.

But new Jim Crow laws are here, separate but equal all the way down the line. Not sure what I mean? Check it out:

Same-sex marriages
States granting rights similar to marriage
States granting limited/enumerated rights
Foreign same-sex marriages recognized
Statute bans same-sex marriage
Constitution bans same-sex marriage
Constitution bans same-sex marriage and other kinds of same-sex unions

’Cause people loving each other is really something the states have to pass laws against. This isn’t just an issue for gay people to take up—the underground railroad didn’t run itself. This is an issue for all citizens. The rights of any one citizen are inalienable… and the second it’s okay to deny some for a few, you’re opening the door. I’ll end with one other thought: I’ve never been comfortable with how close some religions and homophobia walk with each other. I’m pretty sure this map is a good indication of just exactly where the erosion of the separation between church and state is at its worst. …If you have a minute, check it against this year’s presidential electoral map. You may see some similar artifacts…

Posted 2828 days ago by Dylan Moore ·

Yeah. No more coffee for me.

URL clipboard for Friday morning:
(… I’m working my way up to 2009, one shader at a time…)

maybe I’ll sleep tomorrow? ( =

Posted 2833 days ago by Dylan Moore ·

GPU Gems

GPU Gems, edited by Randima Fernando, can be found as an HTML version on nvidia’s developer site. (This is old news to many, I’m sure. But… Yay nvidia!)

[ Link ]

I’m particularly drawn to the section on image processing on the GPU, specifically chapter 21, which talks about creating a real-time glow. Great for the algorithm, but I’m more of a GLSL guy than Cg—so here are some GLSL-specific links:

[ Image Filtering with GLSL ]

Posted 2833 days ago by Dylan Moore ·

i-fire cameras for my table

Okay, I bit the bullet and purchased a monochromatic i-fire cam from unibrain. I chose it over a color camera for a few reasons. For one, the Bayer filter applied to the CCD cuts the IR and near-IR response of something like two-thirds to three-fourths of the CCD’s pixels. I’m not saying a color camera won’t be able to see the IR, I’m saying it just won’t see it as well. Secondly, because of the interpolation, it loses some of the resolution, from 480 lines (b&w) to 400 (color).

See more information on bayer filters [ here ]

Posted 2833 days ago by Dylan Moore ·

I <3 Google!

In the normal course of using Google Maps today, I discovered they replaced the typical white hand cursor with a black one… very cool.

Happy Martin Luther King Jr. Day
(Another graphic brought to you by the awesome Shepard Fairey!)

Posted 2836 days ago by Dylan Moore ·

URL pasteboard

Posted 2836 days ago by Dylan Moore ·

Frame Buffer Objects

A full example in one source file of how to render to a texture using an FBO and then texture a box with it. It works in Xcode, but uses GLUT, so it will probably work in whatever you use. Click on read more for extra info…

Keywords: FBO, GLUT, Macintosh, Xcode, Platform agnostic, Render to a texture, example, tutorial.
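
For reference, the core of the setup boils down to something like the helper below (the EXT-suffix flavor of the extension). The full working example is behind the read more link.

#include <OpenGL/gl.h>
#include <OpenGL/glext.h>
#include <cstdio>

// Attach an already-created, sized RGBA texture as an FBO color target.
// Returns the FBO id, or 0 if the attachment is incomplete.
GLuint makeRenderTarget(GLuint tex)
{
    GLuint fbo = 0;
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT) {
        std::fprintf(stderr, "FBO incomplete\n");
        glDeleteFramebuffersEXT(1, &fbo);
        fbo = 0;
    }
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);   // back to the window framebuffer
    return fbo;
}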

read more

Posted 2839 days ago by Dylan Moore ·

Displacement map / Normals & Lighting

Some great tutorials are available on lighthouse3d.

I’m reading through these right now for some help on my vis system later next month:
Smoothing with a Matrix Filter
Simulating Lighting Computations without Lights

Posted 2841 days ago by Dylan Moore ·

Yeah, this is our apartment.

Kelly, my housemate, has totally pegged our apartment. It looks like this:

PS … it’s:
sudo osascript -e "set Volume 10"

Posted 2842 days ago by Dylan Moore ·

PCB ordering resources

I’m considering getting my final boards professionally etched, screened & masked. I’ll even pitch in for some real solder paste to reduce the urge to put a bullet in my head by the end of Feb. Don’t get me wrong, I love etching my own boards, but I’ll be working with some tight pitches and 100% SMD components, and I’m feeling a little spoiled from having used solder masks in the past… they’re so nice. – Used these guys at MERL for the low-quantity boards for the mocap system. Good turnaround time, okay prices. They’ll be my baseline for pricing other sites.

Also, a good tip from an instructable for PCB etching—drilling thru-holes for components is rough. The drill likes to wander. This is how the author of the instructable suggests solving the problem:

“Here’s my secret to drilling lots of tiny holes with a hand-held drill: use a scrap piece of acrylic as a drill guide. Drill a hole in the acrylic, then drill through that hole and through the board. The clear acrylic makes it easy to line up the drill bit correctly on the center of each pad. After a dozen holes or so, the “guide hole” in the acrylic will start to “loosen up” — just drill another guide hole & keep going. “

Nice! Here’s a link to the whole thing, if you’re unfamiliar with this technique:

Posted 2843 days ago by Dylan Moore ·
