Author Archives: jwothke
In addition to showing some cool raymarching-based realtime graphics, my latest web page is dedicated to “269 Life”, and the matching title is meant to attract some extra attention to that meaningful movement (see http://www.269life.com).
The realtime WebGL web page can be found here: https://www.wothke.ch/269life/. You’ll need some sort of 3D graphics accelerator and a Chrome browser to use it. Or you can have a look at a YouTube recording here: https://www.youtube.com/watch?v=Y3S3MY3Vuf4 (I am still looking for a volunteer to create a higher quality recording for me 🙂 )
I won’t repeat the information that can already be found in the comments of the YouTube video here. Instead, I’ve added some background information regarding the techniques that I used in the page.
All the fractal graphics are created using Knighty’s “pseudo kleinian” algorithm (see the example code in “Fragmentarium”) as a base, parameterized with various “distance estimate” functions. An “orbit trap” based implementation is used to color the result. Depending on the specific “scene”, a number of reflections is calculated (up to three). Finally, the “Blinn-Phong” shading model is used in combination with a standard “ambient occlusion” implementation to render the fractal background (basically the same implementations that I had previously used in “modum panem”).
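To give a rough idea of the approach: a minimal C++ sketch of a pseudo-kleinian style distance estimator with a simple orbit trap follows. The fold cell size, trap choice and iteration count here are illustrative assumptions, not the parameters actually used in the page (and the real thing runs as GLSL in a shader):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

// Sketch of a pseudo-kleinian distance estimate (in the spirit of Knighty's
// Fragmentarium example). CSize and the iteration count are assumed values.
// "trap" records the minimum orbit radius, usable for orbit-trap coloring.
double pseudoKleinianDE(Vec3 p, double& trap) {
    const Vec3 CSize = {0.92436, 0.90756, 0.92436};  // folding cell size (assumed)
    double scale = 1.0;
    trap = 1e10;
    for (int i = 0; i < 10; ++i) {
        // box fold into the cell [-CSize, CSize]
        p.x = 2.0 * std::clamp(p.x, -CSize.x, CSize.x) - p.x;
        p.y = 2.0 * std::clamp(p.y, -CSize.y, CSize.y) - p.y;
        p.z = 2.0 * std::clamp(p.z, -CSize.z, CSize.z) - p.z;
        // sphere inversion
        double r2 = p.x * p.x + p.y * p.y + p.z * p.z;
        double k = std::max(1.0 / r2, 1.0);
        p.x *= k; p.y *= k; p.z *= k;
        scale *= k;
        // track the orbit for coloring
        trap = std::min(trap, r2);
    }
    // distance estimate to a simple primitive, corrected by accumulated scale
    return 0.25 * std::abs(p.z) / scale;
}
```

The raymarcher would step along each view ray by the returned distance until it gets close enough to the surface, then use `trap` to pick a color.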
Three different approaches are used to display text elements:
- “Flat” texts are created using a font texture that is displayed via simple triangles (two per character).
- Texts like the title, or the ones in the “greetings” section, are based on extruded 3D fonts (see the standard THREE.js examples).
- Finally there are the “particle based” texts that explode in the “greetings” section, which are created using regular canvas text rendering.
A “bokeh” postprocessing pass is applied to the resulting page to create a “depth of field” effect. (The respective implementation is derived from Dave Hoskins’ work.) The “bokeh” postprocessing is also used to create some interesting distortion effects on the overlaid title text (which does not use the same z-buffer).
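At its core, a depth-of-field pass computes a per-pixel “circle of confusion” from the stored depth and blurs accordingly. A rough sketch of that radius computation follows; the focus distance and blur scale here are made-up parameters, not the values used in the page:

```cpp
#include <algorithm>
#include <cmath>

// Per-pixel circle-of-confusion size for a thin-lens style "bokeh" blur.
// focusDist and blurScale are illustrative parameters, not the page's values.
double circleOfConfusion(double depth, double focusDist, double blurScale,
                         double maxRadius) {
    // pixels at the focus distance stay sharp; blur grows with relative
    // distance from the focal plane
    double coc = blurScale * std::abs(depth - focusDist) / std::max(depth, 1e-6);
    return std::min(coc, maxRadius);  // keep far-away blur bounded
}
```

The shader then gathers samples in a disc of that radius; because the overlaid title text is not written into the same z-buffer, it inherits whatever blur the fractal depth behind it dictates, which is what produces the distortion effect mentioned above.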
Finally the “greetings” scene showcases the combination of “standard” THREE.js elements (particles, extruded texts, etc.) with the shader-generated fractal background: by having the fractal shader propagate its z-buffer information, the regular THREE.js overlays are later clipped correctly (thanks to Marius for the respective depth calculation – see boxplorer2).
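The idea behind propagating the raymarcher’s z-buffer can be sketched as follows: the eye-space hit distance of each ray is converted into the same non-linear [0,1] depth value that a standard perspective projection would write, so rasterized THREE.js geometry clips against the fractal. This is the textbook OpenGL conversion written as plain C++; the actual GLSL in the page (based on the boxplorer2 depth calculation) may differ in detail:

```cpp
#include <cmath>

// Convert an eye-space hit distance from the raymarcher into the value to
// write to gl_FragDepth, assuming a standard perspective projection.
// near/far must match the camera used for the rasterized THREE.js pass.
double rayDistanceToDepth(double eyeZ, double near, double far) {
    // standard OpenGL perspective depth: NDC z in [-1,1], then mapped to [0,1]
    double ndcZ = (far + near) / (far - near)
                - (2.0 * far * near) / ((far - near) * eyeZ);
    return 0.5 * ndcZ + 0.5;
}
```

With that value written per fragment, the GPU’s regular depth test does the clipping for free.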
The “neon signs” here are created via a postprocessing pass that adds a “glow” as well as a “god ray” effect. A simple random-noise based shader is used to create the purple “northern lights” on the horizon, and 20’000 confetti particles provide for some action.
Thanks again to Wolf Budgenhagen and Darkmelo for letting me use their music.
Obviously there was something missing in my previous mandelbox experiment… exactly, some means of transportation so you can explore the scenery!
I therefore extended the original version such that “regular” 3D stuff can be mixed with the raymarching based fractal landscape.
I also added a bit of “collision detection” so that the landscape can be moved “out of the way” when your transport gets too close.
So enjoy your ride: https://www.wothke.ch/modum/#/wright-and-bastard/sets/evoke-2016
Or have a look at the YouTube recording: https://www.youtube.com/watch?v=bx0Aakv3JPE
Notice: Unfortunately Chrome seems to be the only browser that currently properly supports WEBGL_draw_buffers. (Firefox also claims to support the feature, but in fact it completely messes up the respective color attachments.)
And to complete the trilogy, here is another bit of fractal fun (in case your PC is not equipped with a fast enough graphics accelerator, you can find a video recording here: https://youtu.be/PzTwOfoZp0E):
I just did a little revival of my old WebGL fractal stuff. The two screens below use the latest version of my Commodore C64 emulator to play two fitting music creations by Markus Klein.
I have an older automated lawn sprinkler system installed in my garden. So when the respective controller started to misbehave, my first reflex was to try and repair it. But not having any real clue about electronics, this proved to be somewhat difficult, and given the limited functionality that the commercial product had offered in the first place, I reckoned that it wasn’t worth the effort anyway.
But being tired of performing the same manual irrigation round every one or two days during summer – not to mention the vegetable garden – I thought that it might be a fun little project to design a replacement tailored to my needs, while learning a thing or two about simple electronics.
What I wanted was a system that would not just stupidly irrigate based on some predetermined schedule, but a system flexible enough to adapt to actual irrigation needs. Also I wanted to be able to collect (on my PC) all weather/irrigation related data, so that I can verify that the system actually has the desired effect. It should then be possible to interact with the system without having to rewire the garden. Obviously the system should continue to function (and not lose sensor data) while my PC is switched off.
Side note: As a general principle I am trying not to rely on WiFi at all, but the place where the irrigation is supposed to happen is also actually out of regular WiFi range (unless I were to use additional repeaters). I would have liked to use some powerline based approach but rejected that idea (for now) due to the added costs and engineering complexities. I am not interested in mobile phone based approaches either: First of all the whole point of this exercise is “automation”, i.e. I want to prepare for those moments where I am NOT at home (e.g. when I am on vacation) and where I do not have adequate information regarding which of my plants might be in need of irrigation. Having a phone (or web based) “remote control” therefore seems to me a rather useless gadget. Finally I don’t want to pay for a respective phone contract (that I otherwise do not need) just to irrigate my garden.
These being my first baby steps in the realm of DIY hardware, I wanted to use something as inexpensive as possible – so that it would not hurt financially regardless of how many electronics components I might fry due to my noobish ignorance. I therefore decided to base my experiment on some cheap Chinese Arduinos. These things cost about a Euro each, such that the single most expensive component for all of my devices finally proved to be the plastic case used to package them.
I ended up building three types of devices:
1) soil/air sensors: Equipped with a simple 433MHz transmitter, these devices periodically broadcast their measurements, and various power saving techniques make sure that they’ll get through the season on their rechargeable 9V battery.
2) RainMaker controller: This controls the irrigation valves (based on configuration and sensor data) and it also buffers and relays (to the PC) the data that comes from the sensors. It uses a somewhat more powerful transceiver for two-way communication with the PC.
3) USB interface: Simple device that is connected to the PC’s USB port. It allows the PC to communicate with the RainMaker device(s) in the field – or any other device that is based on my respective custom communication protocol.
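The post doesn’t specify the custom communication protocol, so purely as an illustration, here is a generic sketch of the kind of frame a soil sensor might broadcast over a lossy 433MHz link: a sender id, a sequence number (to spot lost or duplicated broadcasts), two 16-bit readings and an XOR checksum so that corrupted frames can be dropped on the receiving side. All field names and sizes are hypothetical:

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical sensor broadcast frame -- NOT the actual protocol of the post.
struct SensorFrame {
    uint8_t  senderId;
    uint8_t  seq;        // lets the receiver spot lost/duplicate broadcasts
    uint16_t soil;       // raw ADC reading, e.g. soil moisture
    uint16_t battery;    // battery voltage in millivolts
    uint8_t  checksum;
};

uint8_t xorChecksum(const uint8_t* data, size_t len) {
    uint8_t c = 0;
    for (size_t i = 0; i < len; ++i) c ^= data[i];
    return c;
}

// Serialize explicitly byte-by-byte (avoids packed-struct portability issues).
size_t encodeFrame(const SensorFrame& f, uint8_t out[8]) {
    out[0] = f.senderId;
    out[1] = f.seq;
    out[2] = f.soil & 0xFF;    out[3] = f.soil >> 8;
    out[4] = f.battery & 0xFF; out[5] = f.battery >> 8;
    out[6] = xorChecksum(out, 6);
    return 7;
}

bool decodeFrame(const uint8_t* in, size_t len, SensorFrame& f) {
    if (len < 7 || xorChecksum(in, 6) != in[6]) return false;  // reject garbage
    f.senderId = in[0];
    f.seq      = in[1];
    f.soil     = in[2] | (uint16_t(in[3]) << 8);
    f.battery  = in[4] | (uint16_t(in[5]) << 8);
    f.checksum = in[6];
    return true;
}
```

On a one-way 433MHz link with no acknowledgements, simply repeating each broadcast a few times and letting the receiver de-duplicate via `seq` is a common, battery-friendly way to cope with lost frames.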
When ordering the parts in small quantities from AliExpress, the material costs for the different devices are: $10–12 per sensor (depending on features), $12 for the USB interface, $45 for the RainMaker and another $45 for the irrigation valves (6-way) assembly.
For the PC side I set up a simple DB to store sensor/irrigation data, and I wrote a simple Java GUI to visualize that data and to interact with available devices in the field.
So far I have successfully completed the dry-testing, and soon it will be time to test the devices in the field – where it remains to be seen what additional insights will come of that.
Lessons learnt so far:
- With a bit of help from an electronics-savvy coach (thank you Markus! 🙂 ), doing a bit of DIY hardware can be a fun experience – even for electronics beginners.
- As soon as a Mickey Mouse project gets just a tiny bit “bigger”, you may quickly run into the limitations of the smaller, inexpensive Arduinos’ hardware (e.g. if you want to use a receiver plus an additional transceiver, an LCD, a real-time clock, some relays and an EEPROM, you may run out of I/O pins; and even when you circumvent that issue (e.g. using I2C and some PISO shift register), you might find that your program – with all the required libs – just barely fits into the available 32KB of flash ROM). Except for very simple devices, optimizing your stuff for size seems to be part of the Arduino deal. I had last encountered these kinds of problems on the C64 some 30 years ago 🙂
- What people seem to call an IDE in the Arduino world can be rather frustrating for anyone who is used to working with a real IDE (e.g. IntelliJ IDEA). Calling it “AlmostNotepad” would be a more fitting description. You certainly don’t want to make programming errors in that kind of environment.. ever 😉
- On the upside, there is a friendly, supportive Arduino community that can considerably help you along your learning curve.
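The PISO shift register trick mentioned above (one of the ways to stretch scarce I/O pins) can be sketched like this. The pin I/O is abstracted behind callbacks so the bit-banging logic runs anywhere; on a real Arduino these would be `digitalWrite()`/`digitalRead()` on the latch, clock and data pins of e.g. a 74HC165:

```cpp
#include <cstdint>
#include <functional>

// Read 8 parallel inputs through a single serial data pin via a 74HC165-style
// PISO (parallel-in, serial-out) shift register. Pin access is injected as
// callbacks so this sketch is testable off-device.
uint8_t readPiso(const std::function<void(bool)>& setLatch,
                 const std::function<void(bool)>& setClock,
                 const std::function<bool()>& readData) {
    setLatch(false);  // pulse latch low to snapshot the 8 parallel inputs
    setLatch(true);
    uint8_t value = 0;
    for (int i = 7; i >= 0; --i) {   // MSB is shifted out first on the '165
        if (readData()) value |= (1u << i);
        setClock(true);              // clock pulse shifts the next bit out
        setClock(false);
    }
    return value;
}
```

With one such register, eight buttons or sensor lines cost only three Arduino pins (and registers can be daisy-chained for more).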
Just a little experiment in how to synchronize the visualization of additional data streams with the playback of WebAudio music: the music samples are generated on the fly using a ScriptProcessor emulating some legacy AtariST. In addition to the stereo output, the respective ScriptProcessor also generates three streams containing the “playback volume” of the AtariST’s internal sound chip voices:
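As a generic sketch of how such per-voice “volume” streams can be derived (the post does not say how the emulator actually extracts them, so this simply computes one RMS value per block of generated samples per voice):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// For each voice's sample buffer, compute the RMS of one block -- a simple
// stand-in for a per-voice "playback volume" stream emitted alongside the
// audio. (Illustrative only; not the emulator's actual method.)
std::vector<double> blockVolumes(const std::vector<std::vector<float>>& voices,
                                 size_t blockStart, size_t blockLen) {
    std::vector<double> volumes;
    for (const auto& samples : voices) {
        double sum = 0.0;
        size_t end = std::min(blockStart + blockLen, samples.size());
        for (size_t i = blockStart; i < end; ++i)
            sum += double(samples[i]) * samples[i];
        size_t n = (end > blockStart) ? end - blockStart : 1;
        volumes.push_back(std::sqrt(sum / n));  // RMS ~ perceived volume
    }
    return volumes;
}
```

The visualization then just consumes one such value per voice per audio block, keeping it in sync with playback for free.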
Just for fun, here is a more psychedelic WebGL-based rendering of the same data (the WebGL here combines an orbit trap fractal with an inverse distortion, and the “music volume” is used to parameterize the rendering):
Packed structs are certainly not a good idea if a program is supposed to be portable. Unfortunately that is exactly what ZXTune is using to parse the different binary music files.
“One of the rules of packed structs is that you have to access them directly through the packed struct type. In other words, you can’t, in general, take the address of a packed field and then access the field through that pointer, because the pointer type won’t reflect the packed attribute.” (sunfishcode)
Unfortunately, ZXTune uses boost::array instances within its various packed structs. The problem: when methods are invoked on a boost::array (or std::array, etc.), the ‘this’ argument to the boost::array functions may be misaligned, but the boost::array functions themselves don’t know this.
On CPUs that don’t mind unaligned memory access you may get away with it without realizing that there is a problem – and in this case it was my attempt to cross-compile the program using Emscripten that revealed the issue. Not wanting to rewrite too much of the foreign code, I opted for a quick fix: replacing the boost::array with a built-in array fixed the immediate problem…
Naturally, a clean implementation should refrain from depending on unaligned memory access at all… not all CPUs are as forgiving.
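To make the trap concrete, here is a small illustration with a hypothetical header layout (not ZXTune’s actual structs): inside a packed struct, a `uint32_t` field can sit at an odd offset, so taking its address (or, equivalently, the implicit `this` of a member function, as with boost::array) yields a misaligned pointer. Copying the bytes out with `memcpy` sidesteps the problem on any CPU:

```cpp
#include <cstdint>
#include <cstring>

// Hypothetical packed file header -- only for demonstrating the alignment
// issue, not taken from ZXTune.
#pragma pack(push, 1)
struct PackedHeader {
    uint8_t  tag;     // 1 byte -- forces the next field to offset 1
    uint32_t length;  // misaligned: lives at offset 1 instead of 4
};
#pragma pack(pop)

// Dereferencing a pointer to the misaligned field directly would be undefined
// behavior on strict-alignment targets; memcpy from its address is always safe.
uint32_t readLength(const PackedHeader* h) {
    uint32_t len;
    std::memcpy(&len, &h->length, sizeof len);
    return len;
}
```

This is essentially why replacing the boost::array members with built-in arrays helped: plain arrays are accessed through the packed struct type itself, so the compiler knows about the missing alignment, whereas member function calls lose that information.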
(Click on the image below for a live demo.)
It was back “in the old days”, and I remember my relief when one day I found out that PCs were not necessarily mute: thanks to some “walking frame” called “AdLib”, they could actually make sounds… and a bit later things became pretty neat with the rise of the Sound Blaster…
AdPlug plays sound data, originally created for the AdLib (OPL2) and Sound Blaster (Dual OPL2/OPL3) audio boards, directly from its original format on top of an emulator.
My latest bit of Xmas tinkering is an HTML5/WebAudio version of AdPlug (thanks to Simon Peter and the other authors of AdPlug). For a live demo click on the image below..
To complete the set of chipmusic emulators, here my WebAudio version of SC68 – an AtariST music emulator (that plays files like *.sc68 or *.sndh).
Even though I had done some programming on the MegaST 4 back in the ’90s, I have to admit that at the time I had not realized that the machine had anything resembling a sound chip. Those were the days of 68k assembler, DSP machine code and GFA-Basic… and we were just doing paintbox software for the “CHILI” framegrabber and realtime video-effects extension card… 🙂
But it seems the ST not only did have a MIDI interface but also a built-in sound chip…
Thanks to Photon Storm for sponsoring this little project.
This experiment once again confirmed my earlier observation that the debugger support built into today’s web browsers is utterly useless (except for the most trivial scenarios). So this was a travel back in time not only with regard to home computer music but also with regard to the modern development tools that I had gotten used to: bye bye IDE – welcome back debug/trace output.