
The Misadventures of Quinxy von Besiex: truths, lies, and everything in between

18 Sep 2012

Router with slow download and upload speeds? Don’t connect it through your UPS’s network surge/voltage protection!

This is just a technical note for anyone who winds up with the problem I hit the other day. I bought a new wifi router, hooked it up, and was shocked to find my download and upload speeds were abysmal, in the 1 Mbps range (rather than the expected 75 Mbps down and 20 Mbps up). I spent an hour trying different firmware in the router, looking for problematic options in the router's config, and replacing cabling.

But somewhere in my sleeping hours the answer came to me. My cable modem's network cable feeds into my UPS, which then feeds into the wireless router. The UPS provides voltage filtering on that line, arguably useful for preventing distant lightning in my rural neighborhood from destroying more networking equipment than it otherwise might. It occurred to me that the UPS's network pass-through might not be gigabit rated. My old wifi router was not a gigabit router, so it communicated with the cable modem at 100 Mbps, within the rating of the UPS. But the cable modem is likely gigabit and the new wifi router is gigabit, so the two would try to communicate at gigabit speeds, and the UPS may unwittingly wreck that link with the filtering it does on the signals passing through. I bypassed the UPS and, sure enough, everything now works as expected!

21 Jul 2012

The Journey of the M820



I picked up my 1971 AMC M820 Expansible Van from Mt. Vernon, IL this past week.  It turned out to be quite the little naive odyssey.

The outbound journey was relatively uneventful; we got a one-way rental to drive myself, Francine, and Osita (the dog) from Pennsylvania to Illinois. After a thirteen-hour drive spread across two days we met the seller, Wade, got to test drive the vehicle, bought it, and retired to the hotel to contemplate our next move.

Luxury It Ain't

Three major obstacles became immediately clear when I saw the vehicle and got to drive it.  The first problem was that the vehicle's cab was tiny.  There was arguably room enough for three lean soldiers with little to no gear and little to no leg room, but two average folks and a dog would not fit easily.  The second problem was that the engine was deafeningly loud, the Army having made no effort to provide a quiet cabin.  The third and most serious was that Illinois happened to be in the middle of a record-breaking heat wave, and daily temperatures were reaching 107 degrees Fahrenheit.  An un-air-conditioned cab combined with a heat-radiating engine and transmission was a recipe for disaster.  The fact that my dog is super fluffy and inappropriately keeps her winter coat on until September didn't help.  I knew I had to solve all three issues before we could start for home.

Where Does a Dog Fit?

There was only one place Osita would fit and that was on the floor board.  I put down a furniture moving pad and a dog bed to cushion the harsh metal floor and cover up sharp edges.  The difficulty was that she is a large dog and her body took up all of the passenger's floor space as well as the middle of the floor.  Her upper body was wedged between the transmission's stick shift and the high-low transfer case shifter.  She had to keep her head up and out of the way whenever I needed to shift gears, which involved quite a lot of work on Francine's part.

Osita was a real trooper.  She would instantly find her place whenever I had to lift her back in, and she didn't move around at all once she settled.  I think it was all the practice in the motorcycle sidecar that taught her such patience with us humans.

Francine was an amazingly good sport for having to put up with very limited leg room and the constant need to keep Osita out of the way.

Silencing the Deafening Roar

For us humans the solution to the engine noise was easy.  We wore ear plugs.  Protecting Osita's hearing proved a little more challenging, and in retrospect I'm not sure how much good it did.

I bought Francine and Osita two pairs of the best headphones Lowe's had to sell.  Francine could wear hers without modification, but Osita's pair required some changes.  I removed the adjustable metal band at the top and replaced it with two straps which could be tightened or loosened with Velcro.  I also added a chin strap whose length could be adjusted.  The system worked, but only sort of.  My primary concern was that her ears are vastly bigger than ours, and while I could (barely) fit her folded-up ear into the headphone ear cup, I couldn't imagine that it was pleasant, and I couldn't be sure that the seal was all that effective in terms of loudness protection.  I abandoned this solution in the end after a few short trials on the road.  They came off too easily and I was just too afraid they would hurt her ear cartilage if left on too long.

The only fallback I had available was to use human foam earplugs.  I did some Googling and saw people specifically recommending against their use, since human earplugs are smaller than what dogs would need.  Without any alternatives I decided to give it a try anyway, but instead of using just one per ear I used two together in each.  This approach seemed to work and the plugs stayed put.  To what degree they eliminated the sound I can't be absolutely sure; I know when I use a pair they can be finicky, seeming to be in right and yet needing adjustment to block out all the noise.  I felt somewhat comfortable, hopefully not foolishly, that her hearing would be protected because I had just a few days earlier read a passage in a book, How Dogs Think, that mentioned dogs having a biological mechanism by which they can protect their hearing from loud noises they can expect (environmental ones, versus isolated and unexpected ones like gun shots).  If the ear plugs didn't do enough, presumably her biology would.

Cooling the Air

Finding a solution for the 107-degree (and higher) heat was the big problem.  On the route down I'd tried to improve upon our rental car's poor A/C by buying a few bags of ice, putting some inside zip lock bags distributed around the passenger compartment and some in disposable aluminum pans on the floor board.  That did nothing to cool the interior.  I knew that the complete lack of space in the cab made it impossible to improve upon this crude method by simply adding more ice.  Instead I decided to do the only thing I could think of: build a rudimentary air conditioning system powered by ice, with the ice located outside the cab.  And that's what I built.

The key components of an air conditioner, at least for the design I was going to employ, were a refrigerated liquid, some cold coils to transfer the cab's heat into that liquid, an electric fan to accelerate the heat transfer, a pump to circulate the liquid, an insulated container to hold it, and hoses to carry it to and fro.  I went to the local Pep Boys auto supply store and bought an aftermarket automatic transmission oil cooler to use as my cold coils, an electric radiator fan to use as my fan, and fuel line to use as my hose.  At the local Walmart I found the bilge pump and large insulated cooler I needed.  And a quick trip to Radio Shack got me the switches and wires I'd use to turn everything on and off at will.

Retreating into the hotel room and out of the heat's insanity, I assembled all the parts.  At this point I really wasn't sure how effective the system would be, that is, how well it could remove heat from the cab given a sufficient quantity of ice.  Once I'd installed everything in the vehicle and got a chance to test it, I was very pleased to discover that the system was very good at removing heat (that is, blowing cold air).  Even so, I wasn't sure it would be cold enough, since the hottest part of the day had already passed by the time I tested it.  Remembering something from my high school earth science class, I went back to Walmart to buy four big boxes of rock salt, which I knew would dramatically lower the freezing point of the water and thereby drop the temperature of the ice/water mix even further.  I had brought along a big bag of tools for this trip, among them my infrared thermometer (it's a useful tool for motorcycle carburetor tuning).  I found that adding the rock salt dropped the temperature of the ice/water slush from about 32 degrees to about 3 degrees, which significantly improved the cooling in the cab.
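
For anyone curious why the salt helps so much: it's ordinary freezing-point depression.  A rough back-of-the-envelope figure, assuming plain NaCl and the ideal dilute-solution formula (which overestimates at slush-like concentrations, so treat it as a ballpark):

    ΔT_f = i · K_f · m  ≈  2 × 1.86 °C·kg/mol × 5 mol/kg  ≈  19 °C  (≈ 34 °F)

In other words, a heavily salted slush can sit tens of degrees below 32 °F; the practical floor for salt brine is the eutectic point near -6 °F, so a reading of about 3 °F is right in the expected range.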

Everything was very nearly a marvelous success, though it didn't take long for several mostly fatal flaws to emerge.  Thus, I'm not sure I can recommend this system to others facing similar circumstances.

This system runs through ice very, very quickly.  The air conditioning effect would only last for about 45 to 60 minutes, after which the four to five bags of ice would be reduced to warm, cabin-temperature brine.  And as it doesn't make sense to break a 13-hour trip into 45-minute ice-refilling segments, we only had cool air for the first hour of every three or so.  Not to mention that the rate of ice consumption meant the system cost $5-8 an hour to operate, which is just pricey enough to make you think twice.

Worse luck, the fundamental resource without which the entire system wouldn't work (ice!) was magically unavailable at all the highway stops in West Virginia; WV was recovering from a serious storm that knocked out power to tens of thousands of residents, who had bought up all the ice to save their refrigerated groceries.

And the final problem was that an automatic transmission oil cooler was not designed to be used as the cold coil of an air conditioner.  The honeycomb lattice of aluminum that does the heat transfer, through which air passes and becomes cold, seemed exactly the wrong size to rid itself of the condensation that would form.  Cooling hot oil creates no condensation, but cooling hot air does.  My cooling system was so efficient that within mere seconds all the honeycomb elements of the oil cooler would be plugged up with water, making the fan send much of its uncooled air spilling out wherever it could escape the blocked holes.  To keep things working I had to keep running my hand across the face of the cooler to break the surface tension of the water so that it could run off and allow the fan to work again.  This had to be done every minute or so.  I tried using some fabric to wick away the water from the honeycomb and re-evaporate it, but that didn't work.  And I planned to try introducing a light solution of soap to the radiator surface to see if that might be enough to let it shed its own water, but I never quite got around to it before we got home.
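
In hindsight the rate of ice consumption is easy to sanity-check.  Assuming roughly four 10 lb bags (about 18 kg) per fill and counting only the latent heat of melting (about 334 kJ/kg), ignoring the further warming of the melt water:

    Q ≈ 18 kg × 334 kJ/kg ≈ 6 MJ

Spread over the 45 to 60 minutes a fill lasted, that's only about 1.7 to 2.2 kW of cooling, roughly what a small window air conditioner (6,000-7,500 BTU/h) delivers, pitted against a sun-baked steel cab with a hot engine and transmission underneath.  No wonder it couldn't keep up for long.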

Journey's End

I think the truck survived the trip better than the humans (and the dog).

On the first day of our return journey we only made it an hour before the ice ran out and the oppressive heat was just too much.  We paused for a few hours in the shade of a tree off the interstate.  Once the afternoon had set in and the ice was refilled we made it only another hour or so before a violent storm came upon us and we took shelter in the lobby of a hotel.  And when the storm lingered we called it a day.  At that rate I began to fear it would be 3-5 days before we'd make it home.

The next day became an unexpectedly long one, and we ended up completing the remaining 640 miles without stopping to sleep.  It was not our choice, however. We had planned to stop four hours away from home, but not only was West Virginia out of ice (because of the aforementioned storm), every hotel was full up. We called more than twenty, all the ones that took dogs and a few that didn't.  In the end we were left with no option but to drive until we reached home.

While I can't claim to be an expert at driving five-ton trucks, the fact that I didn't hit anything and had no problems surely says something.  I was actually amazed at how well it drove, aside from the miserable uphill speed.  I wasn't able to do more than 30 mph on many of the hills coming back.  Going only 30 mph when other vehicles are doing 75 mph is certainly not an ideal situation, but the advantage of driving through the night was that the bulk of the hills we encountered were climbed when few others were on the road.  The top speed of the vehicle on level ground is only about 57 mph, which meant that in the entire 720-mile trip I don't think I passed a single vehicle.

In the end the toll on man and dog was high.  The stress from the drive left us humans bickering throughout the next couple of days, and Osita ended up with a vet trip to treat vomiting and mild dehydration.

But with all of us restored, my focus will now be on turning the M820 into a mobile gentleman's study (and my office).

^ Quinxy

11 Jun 2012

The Mobile Gentleman’s Study

I recently bought a 1971 AMC 5 ton Army Surplus M820 Expansible Van.  My goal is to transform it into a mobile office in the style of a Victorian gentleman's study.

See some photos of the truck and some first thoughts about how I might redo the interior of the expansible box below.

^ Quinxy

6 Jun 2012

YouTube Thinks I’m a Nazi (but I’m not!)

A few weeks back I stumbled across a forum thread on Holocaust denial.  I'd first read about the topic some 15 years ago, when Usenet was the Internet's popular discussion forum.  The years hadn't diminished my fascination with the notion that a militant minority fervently denies that events occurred which the majority accepts as wholly factual.  How could there be disagreement about such seemingly self-evident world events (with millions of people involved as witnesses, victims, perpetrators, etc.)?  I'll write more on the topic at some point, perhaps, since I enjoy tracing everyone's ulterior motives and seeing how they influence what should be rational discussion.  But for now I'll just mention the horror that greeted me when I logged back on to YouTube after having watched a series of videos on this topic.  YouTube had apparently decided that I was a neo-Nazi and wanted to helpfully recommend like-minded channels I should subscribe to.  Yikes.

I am pleased, I suppose, that YouTube doesn't play favorites with ideas and allows minority opinions and majority opinions to be heard and subscribed to, but I do wish to God there were a way I could firmly explain to YouTube that interest in a topic does not mean subscription to the idea at the heart of that topic.  As there is none, I'll just have to announce, for the benefit of any government, conspiratorial, Zionist, etc. agency listening, that there has been a terrible misunderstanding, and I am not a Nazi.

^ Quinxy

5 Jun 2012

Microsoft Leaves .Net 3.5 (and earlier) support out of Windows 8?!?!

In a moment of anything but wisdom Microsoft has decided to leave earlier versions of the .Net (dotnet) Framework out of the Windows 8 install, including only 4 and 4.5. The reason they give for this peculiar decision is their desire to have a smaller OS install footprint. While less disk space lost to an OS install is a very noble goal, I can think of few things worse to leave out. Any user with Windows 8 who subsequently downloads and wants to use an application written against the 3.5 or earlier .Net runtimes will be forced to install (over the 'net) a reboot-required multi-hundred megabyte installer (supporting .Net 3.5, 3.0, and 2.0). Few things deter a potential user of your software more than a lengthy download and a forced reboot.

Adding insult to injury, I am quite sure their smaller-OS-footprint goal is little more than an attempt to defend against one of Apple's (and others') easy anti-Windows attacks. Unless Microsoft has radically altered the way they handle Windows Updates, their Driver Store, WinSXS, temporary files, etc., whatever savings they claim at initial install will be gone in a few months; the Windows directory of my 1.5-year-old computer is a whopping 37 GB.

Why couldn't Microsoft leave out MS Paint, MS Write, Solitaire, the audio recorder, Pinball, or hell, even Internet Explorer, and include the full range of .Net support? Now we poor developers are once again going to need to distribute versions of our software targeting multiple runtimes just to ensure most users don't have to do the absurd .Net installs.
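
One partial escape hatch worth mentioning: an application built against .Net 2.0/3.5 can often be told to run on the 4.x runtime via its configuration file, which at least spares the user the giant 3.5 download. A sketch (MyApp.exe.config is a placeholder name, and whether your particular app behaves correctly on the newer CLR is something you have to test):

    <!-- MyApp.exe.config: prefer the 4.x CLR if present, otherwise use 2.0 -->
    <configuration>
      <startup>
        <supportedRuntime version="v4.0" />
        <supportedRuntime version="v2.0.50727" />
      </startup>
    </configuration>

It's no substitute for Microsoft just shipping the older runtimes, but it beats making every user sit through the 3.5 installer.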

^ Quinxy

29 May 2012

AutoHotkey versus AutoIt

I've been a huge fan and user of AutoHotkey (AHK) for years, but I've got to admit (with a sense of betrayal) that I'm increasingly impressed with AutoIt. Last week I had an automation project to do and began coding it in AHK, only to run into several major roadblocks. For the automation I needed to traverse a third-party application's tree view UI to find a specific entry and click it. Later in the automation I had to do something similar with a list view control. I had expected to find easy mechanisms or code samples to do this in AHK. To my surprise I found relatively little; the built-in functions relate to creating those GUI elements in your own scripts, not to manipulating already existing elements in other applications. And the little sample code/DLLs I found didn't seem recently updated and didn't work (with AutoHotkey_L). I accidentally stumbled across AutoIt threads on the topic and was pleased to discover it was quite easy with AutoIt, which officially supports those features in its standard include libraries. And thus began my journey into AutoIt.
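
For anyone hitting the same wall, here's roughly the sort of thing that won me over. This is only a sketch: the window title, control descriptors, and item text below are made-up placeholders, to be replaced with whatever the Au3Info tool reports for the actual third-party application you're driving.

    ; Find and click an item in another application's tree view, then in its list view.
    #include <GuiTreeView.au3>
    #include <GuiListView.au3>

    WinWait("Device Browser")
    WinActivate("Device Browser")

    ; Handle to the target app's tree view control
    $hTree = ControlGetHandle("Device Browser", "", "[CLASS:SysTreeView32; INSTANCE:1]")

    ; Locate the entry by its text and click it
    $hItem = _GUICtrlTreeView_FindItem($hTree, "Network adapters")
    If $hItem Then _GUICtrlTreeView_ClickItem($hTree, $hItem)

    ; Same idea for the list view later in the flow
    $hList = ControlGetHandle("Device Browser", "", "[CLASS:SysListView32; INSTANCE:1]")
    $iIndex = _GUICtrlListView_FindText($hList, "Example entry")
    If $iIndex <> -1 Then _GUICtrlListView_ClickItem($hList, $iIndex)

That's it; the _GUICtrlTreeView_* and _GUICtrlListView_* functions ship in AutoIt's standard includes.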

Here are my impressions:

  • The language syntax of AutoIt is more consistent than AHK's, and mostly for that reason I liked it more. When I first started with AHK I found it really confusing that AHK supported multiple distinct paradigms: foo = bar and foo := "bar", as well as the whole Foo(Bar) versus Foo, Bar thing (not to mention Foo Bar, the first comma being optional!?). I still find myself making quite a few typos/errors related to these situations... forgetting what's a normal function and what's the other style of function, putting a := when I meant a =. I'm sure the explanation for all this is historical, but the lingering embrace of all the styles simultaneously is odd (why can't Foo, Bar be called as Foo(Bar) so that people can write to the new paradigm?). Oh, and not to mention the hotkey hooking/specification stuff mixed right in with regular code, which also confused me.
  • The packaging of the AutoIt setup/install is impressive, including the SciTE editor, example code, the extended library of functions, x86 and x64 compilers, an obfuscator, a build tool, an auto updater, and more. I haven't installed AHK recently, so maybe AHK does just as complete an install. I was just pleased that I had to set this up on four computers during testing and development and couldn't have asked for an easier time of it.
  • AutoIt has embeddable compiler and obfuscator directives! You can embed commands in the source that will trigger obfuscation, generation of both x86 and x64 binaries in one compilation run, you can include resources, set the EXE manifest-related data including administrator elevation, PE details, etc. Very nice!
  • AutoIt Help files are almost useless when compared to their AHK counterparts. The index list and the keyword search functions seemed to miss a great deal that should be in their documentation, and it seems as though they do not include many (if not most) of their official support library functions in the help documentation. If you do find the page you need in their docs then everything is okay, they have good examples and references, but I'd swear 60-70% of the time I couldn't find what I needed and had to jump over to their forums or search with Google.
  • The AHK community is absolutely amazing, and it would be hard to top them in terms of friendliness, helpfulness, knowledge, code-sharing, etc. I have only been an observer on the AutoIt boards as I looked for other people's solutions, and so perhaps my observation is meaningless, but I saw more grumpy unfriendliness towards newbies than I'd remembered seeing on the AHK boards. (I'm not saying the AutoIt community isn't great, too, it probably is, it just might be a little less tolerant of newbies and their poorly researched questions.)
  • AutoHotkey automatically handles most GUI interaction logic for you (via the gLabelName identifiers you attach in the various GUI element creation commands), whereas AutoIt requires you to write your own window message processing loop, with a Switch/Select over the returned message, to handle every interaction to which you want to respond (see the sketch after this list).
  • As mentioned earlier there's a distribution-included obfuscator, which seems pretty good. The quasi-lack of one with AHK has been an annoyance of mine; AHK_L doesn't do the password thing any more, and I never had much luck with Hotkey-Camo or anything else.
  • I was impressed with how quickly I was able to jump right into AutoIt using my AHK knowledge. I imagine it'd be harder coming the other way, because of the unusual multi-paradigm AHK language thing. Both languages are remarkably similar in their use, with many functions being identical in name and use. Example: Send, Foo in AHK is Send("Foo") in AutoIt. Within a few hours I was able to automate a relatively complicated and branched Windows dialog flow (related to driver installation, involving tree view navigation, list view navigation, support for different scenarios on different versions of Windows, etc.).
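
To make the message-loop point concrete, here is about the smallest useful AutoIt GUI (just a sketch). Where AHK would jump to a g-label for you, AutoIt has you poll GUIGetMsg() and dispatch every event yourself:

    #include <GUIConstantsEx.au3>

    GUICreate("Demo", 220, 90)
    $idGo = GUICtrlCreateButton("Go", 70, 30, 80, 28)
    GUISetState(@SW_SHOW)

    ; The event loop you write yourself in AutoIt
    While 1
        Switch GUIGetMsg()
            Case $GUI_EVENT_CLOSE
                ExitLoop
            Case $idGo
                MsgBox(0, "Demo", "Button pressed")   ; in AHK a g-label subroutine would run instead
        EndSwitch
    WEnd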

In no way am I concluding that AutoIt is better than AutoHotkey, nor can I conclude the opposite. My love of AutoHotkey isn't wavering, but I am glad AutoIt was there for a task which seemed like it would have been harder for me to do in AHK with the existing public code. So if you ever find yourself in a similar situation you needn't feel shy about trying out AutoIt.

Quinxy

18 May 2012

How to Fix Your HP Support Assistant Install

This morning my HP laptop strongly suggested I upgrade its included support software (the HP Support Assistant).  I foolishly accepted its offer and spent the next four hours ruing that decision and trying to correct the damage it did.  The upgrade somehow screwed itself up (leaving the HP Support Assistant broken) and along the way also screwed up my Visual Basic scripting support.  I found numerous links which talked about related problems but none that fixed mine.  Most solutions revolved around removing the registry keys for the VB scripting DLL and then re-registering the DLL.  For some reason re-registering the DLLs didn't seem to work for me, despite running the command shell elevated.

Ultimately I exported the registry entries from another working Windows 7 x64 computer and merged them on the ailing laptop.  I then uninstalled the HP Software Framework and the HP Support Assistant and reinstalled them in that same order.  And voila, at long last everything worked.

For anyone who needs them, here are the registry keys in question for fixing your VBScript install on Windows 7 x64: HP Support Assistant VB Scripting Registry Fix.

Included are the registry keys (and DLLs) related to 32-bit and 64-bit support.  All you need to do is merge (by opening) the five .reg files included (in the two directories).  I include the DLLs just for reference, in case your installation has a damaged DLL.

^ Quinxy

30 Mar 2012

Cell Phones as R/C Quadcopter FPV TX/RX and Homing Beacon

Flying multi-rotor (particularly quadcopter) radio-controlled vehicles is a lot of fun and you can do amazing things with them, in particular some beautiful aerial photography.  While most r/c pilots fly by looking up at their craft from whatever their distance happens to be at the moment, a growing number of r/c enthusiasts are using FPV (first person view) to remotely control their vehicles.  By using a tiny video camera with an attached transmitter, the pilot can virtually fly their quadcopter as though they were a miniature pilot located inside it.  Aside from just being a lot of fun, this perspective makes it possible to fly over far longer distances than one could by merely looking at the craft from the ground.  The equipment to do FPV is not cheap, however, with decent entry-level setups of camera, transmitter, and receiver costing $1,000.

And so I couldn't help but wonder why no one talks about using the ubiquitous smart cell phone, with its included cameras, as an alternative solution.  The cell phone has a number of advantages over a typical FPV setup: with the right apps running it can record its own video, it can operate over almost infinite distance by using cellular networks for data transmission, it can operate with very high bandwidth over 4G or wifi (though very limited distance with wifi), it can log its own flight path by recording its GPS positions, and if it crashes it can transmit its location to make recovery easy.  With all these advantages in a package that can cost just $100, it's hard to imagine it not being used by everyone!

From what I understand in talking to a few people, the issue comes down to video quality, latency, and the possibility that the connection just drops out.  Few people want to risk their $1,500-and-up r/c darling on Sprint's or AT&T's potentially spotty and variably efficient cell coverage.  And the video streaming software currently available as apps is not intended for the mission-critical, near-real-time transmission that FPV requires.  Imagine trying to drive a car or pilot a plane over Skype.  You might do just fine for a while, as long as sender and receiver have good signals, but if either gets into trouble the video suddenly becomes erratic and delayed, pictured objects become indistinct, and it would be impossible to make critical operating decisions based on that.

Given the enormous complexities being overcome daily by dedicated enthusiasts of r/c flying this seems like a challenge that can fairly easily be overcome.  One of the key pieces of software this community has developed and continually refined is the "flight controller", or the software which takes signals from the pilot's transmitter (the device with the joysticks that he uses to control the vehicle) and turns those into adjustments to motors and control surfaces.  Many flight controllers now even come with amazingly sophisticated auto pilot features, like the ability to hover motionless in one spot (despite winds, etc.), to fly home in the event that signal is lost or a fault develops, to automatically land if batteries are low, and even to navigate on its own, flying between coordinates previously supplied to it.  If all that can be achieved surely the FPV via cell phone problem can be overcome!

There are three problems that need to be addressed: loss of signal, degradation of signal, and overall quality of video.

Loss of video signal is clearly a very real problem, and no amount of clever software can make up for a lack of inputs from the pilot, but such situations can be handled so as to minimize the negative impact.  As I mentioned above, many flight controllers now include safety modes such as hover, automatic return home, and automatic landing.  A pilot with such an onboard autopilot can, at the flick of a switch, tell his craft to do the appropriate thing, presumably either hovering to wait and see if signal is restored or returning home at least far enough to regain the signal.  Also, the FPV app in the cell phone can be optimized to recover quickly from any network failure (once the underlying data connection is restored).  And if this were all to develop and become more sophisticated, an integration of the FPV app and the onboard flight controller would clearly be ideal, allowing the FPV app to command autopilot features directly in the event of network loss, as well as to transmit flight controller telemetry to the ground during routine flight (and perhaps control other features of the flight controller as well).

Degradation of signal and overall quality of video are related problems.  The key here, I believe, is to develop a video codec, or perhaps just an application of existing codecs, that focuses on the critical visual data FPV pilots need.  While users of video streaming applications like Skype want overall picture quality to be good, an FPV pilot is primarily focused on visual information related to the orientation of their craft relative to the ground, potential physical obstacles in their path, and anything necessary to continue whatever flight motion they are executing.  If the signal degrades and there is less bandwidth over which to send video, it's most important that the critical parts of the pilot's picture continue to arrive in near real time!

How exactly one extracts or prioritizes those features I am not sure at this point, but I am fairly certain it's an achievable goal.  One need only think of situations in which one's own vision has degraded due to environmental factors like darkness, fog, or smoke to realize that our brains can seize upon very small and sometimes indistinct cues to maintain orientation.  An algorithm could be developed to prioritize the sending of lower-quality video data related to the horizon and to nearby obstacles (elements of the frame which the algorithm has noted move more relative to the overall background).  And with the ongoing development in the areas of computers interpreting images to extract features like faces, eye positions, smiles, body positions, etc., it should present little challenge to have the FPV app maintain an awareness of very crude items like the horizon and those objects most likely to represent near obstacles.  This specific data could even be transmitted over ultra-low bandwidth as mere vector data rather than actual color images, in other words allowing the receiving cell phone app to reconstruct the approximate figures and overlay them on whatever video may or may not be coming in.
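
The back-of-the-envelope numbers make the vector idea attractive.  Suppose (and these are made-up but plausible figures) the app sent a horizon line (two endpoints) plus six obstacle boxes (four numbers each), all as 16-bit values, refreshed 30 times a second:

    (4 + 6 × 4) values × 16 bits × 30 Hz ≈ 13 kbit/s

That's a few kilobits per second, versus the several hundred kilobits to a few megabits per second that a usable live video stream needs; even a badly degraded cellular connection can usually still carry the former.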

Hopefully these things will see development in the near future, because it can hardly be denied that the potential here is huge.

^ Quinxy

27 Mar 2012

The Mighty Blade mQX Quadcopter

I recently bought the Blade mQX micro quadcopter and it is absolutely amazing!  If you think it looks fun you'd be a fool not to buy one and see just how much. For $129 (Bind-n-Fly) or $169 (Ready to Fly) you just can't go wrong.

The mQX is super responsive, acrobatic, and (near as I can tell) indestructible.

This is my very first quad and very first "real" r/c anything; I've only owned two or three of those cheap, indestructible indoor foam helicopters. Stepping up to the mQX was definitely not easy, but neither was it too great a step to make. If I can do it, you can do it too! But I would ***strongly*** recommend buying the Phoenix R/C simulator and flying the Gaui 330X model (that's the only simulator I've seen with a quad model in it). I spent hours flying the Gaui in the balloon-popping trainer of the Phoenix simulator and it made a profound difference in my flying ability. I had serious doubts that a simulator could help with real-world flying, especially given that I was flying a non-mQX model, but I was happily proven completely wrong. My real-world flying made a quantum leap after a few nights of simulator flying.

One of the most impressive things about the mQX is how inconceivably strong it is. In my early flights especially I would get disoriented and send the mQX screaming into the ground and it sustained absolutely no lasting damage (the canopy was destroyed and I had to bend a propeller blade back to straight a couple times but that's it).

And, coolest of all, the mQX is powerful enough to mount any of the tiny 808 cameras. I put a jumbo keychain 1080p camera on it and it seemed to fly as responsively as it always did, and for seemingly almost as long as it always did. It amazes me that I could have an HD aerial camera platform for under $200. (Obviously it's no $1,000 stabilized system, but on a very calm day, with a good pilot consciously flying for a particular shot, it's stunning!)

I don't usually bother to evangelize a product, but this one deserves it!

If you're interested, check out my beginner's guide to flying r/c helicopters, planes, or quadcopters.

^ Quinxy

14 Mar 2012

Lytro Alternative: Automatic, Intelligent Focus Bracketing

Now in my early post-Lytro days I've been wondering how I could achieve the same effect with better results, not wanting to wait the years it might take for them to come up with a suitable next-generation model.  Lytro's only real selling point at this moment is its ability to take "living pictures" (their parlance), which really just means a photo that is interactive inasmuch as you can focus on different items in the picture by tapping those items.  The technology may be capable of quite a bit more, but that's all it currently delivers, and it delivers that with poor resolution, graininess, and restrictive requirements on lighting and action.

Living Pictures without Lytro

Why couldn't I achieve the exact same effect with far better results using my existing digital camera? I could, and did! Here's my "living picture" proof, using just an ordinary digital camera and a bit of human assistance.

No Lytro was required for this "living picture", just an ordinary digital camera (in this case a Sony NEX 5).  Click on different objects in this scene to change focus depth!

...and now for Lytro's version...

Lytro's "living picture"!  Click on different objects in this scene to change focus depth!

It doesn't take an expert in photo analysis to see that the non-Lytro picture looks much better: sharper, higher resolution, less grainy, and more realistic colors.

Faking Lytro Manually

The simple fact is, if you can capture a succession of photos, each at a different focal setting, and then view those photos with a JavaScript or Flash program that responds to clicks by selecting the photo in which the clicked region is in focus, you can create the pleasing "living picture" effect without any fancy light-field camera required. And that is just what I did for the above demo. I took a series of pictures focusing on the different major elements in the scene and then made the regions in the photos clickable by way of a simple image map which specifies which image in my sequence corresponds to which area. While the approach I took for the demo was manual, it's not hard to see how easily any camera, in combination with some very simple software, could be made to do this, producing final results better than are possible with the Lytro camera.
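
The viewer side is equally trivial. For my demo it was a plain image map in a web page, but the same logic fits in a few lines of any scripting language; here is a desktop sketch in AutoIt with made-up file names and regions, where each rectangle maps clicks to the frame that is sharp in that area:

    #include <GUIConstantsEx.au3>

    ; left, top, right, bottom, and the image that is in focus for that region
    Global $aRegions[2][5] = [ _
            [0, 0, 320, 480, "focus-near.jpg"], _
            [320, 0, 640, 480, "focus-far.jpg"]]

    Global $hGui = GUICreate("Living picture", 640, 480)
    Global $idPic = GUICtrlCreatePic("focus-near.jpg", 0, 0, 640, 480)
    GUISetState(@SW_SHOW)

    While 1
        Switch GUIGetMsg()
            Case $GUI_EVENT_CLOSE
                ExitLoop
            Case $idPic                              ; picture controls are clickable
                $aCur = GUIGetCursorInfo($hGui)      ; [0] = x, [1] = y in window coordinates
                For $i = 0 To UBound($aRegions) - 1
                    If $aCur[0] >= $aRegions[$i][0] And $aCur[0] < $aRegions[$i][2] _
                            And $aCur[1] >= $aRegions[$i][1] And $aCur[1] < $aRegions[$i][3] Then
                        GUICtrlSetImage($idPic, $aRegions[$i][4])
                        ExitLoop
                    EndIf
                Next
        EndSwitch
    WEnd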

Making Cameras Support Lytro-Like Effects

Many cameras these days have a feature called "exposure bracketing", which reacts to a shutter button press by taking a series of pictures at different exposure settings.  You then review the photos later and determine which one looked the best.  Why then could you not have a "focus bracketing" feature which does the same thing but with focus?  The simplest approach would be for the camera to automatically walk the focus back from infinity to macro, taking as many photos as necessary to achieve a desirable effect; perhaps as few as 5 or 10 would be needed, since with the aperture appropriately set any given picture's depth of field is wide enough to cover a significant range sharply.  You would then need some mechanism for assigning clickable regions to the photo frames which happened to be in focus in that region.  This would likely be a fairly trivial software problem to solve.  All of this could be done with minimal camera intelligence, since it would just be varying focus distance in a fixed manner.

A far better but slightly more complicated solution would be to do automatic, intelligent focus bracketing using the camera's built-in autofocus system.  Many cameras (particularly in phones) allow you to select a region of the scene which should be in focus.  It would thus be easy for the camera to break the scene down into a search grid it would scan looking for objects upon which to focus, taking a single picture at each focus depth (one photo per range, according to the depth of field).  The camera could record which grid location contained an object at a given focal distance, this being usable later to relate clicks on an image to a particular focused frame.  The advantage of this approach is that it might be far quicker and more efficient, needing only as many frames as the scene objects' depths require.  A scene with two people hugging in the foreground and a church in the background would probably require just two photos to make a "living picture"; people around a table at a birthday party with a cake in the center and a bounce house in the background might require 5 or 6 photos to make a Lytro-like image.
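
How many frames "walking the focus back from infinity to macro" actually takes falls out of the hyperfocal distance.  Roughly, using the usual thin-lens and circle-of-confusion approximations (so these numbers are only illustrative):

    H ≈ f² / (N · c)

and the classic trick is that frames focused at H, H/3, H/5, H/7, ... have depth-of-field zones that butt up against one another, so k frames cover everything from about H/(2k) out to infinity.  For, say, a 25 mm lens at f/8 with a 0.02 mm circle of confusion, H ≈ 3.9 m, and five frames already cover from roughly 0.4 m to infinity, which is why 5 to 10 frames is a plausible budget for most scenes.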

Working with Motion

These approaches share one significant weakness, which is that using multiple sequentially taken images negates the ability to capture any action-oriented scenes.  While modern digital cameras take rapid-fire photos, and those with exposure bracketing take three or so shots in a half second, that's certainly slow enough to make any significant movement within the scene noticeable when switching between frames.  Still, as action is easily blurred with the first-generation Lytro, this hardly seems any sort of argument against this alternate approach.  An interesting solution to this problem, and to the inability to easily alter most existing cameras' firmware, would be to use a replacement lens that split a single digital frame into multiple differently focused reproductions of the scene.  Just as I use a Loreo 3-D lens to merge the images captured by two lenses onto one digital frame, so too could one produce a system that would use four or perhaps nine lenses to capture one instant onto one digital frame through small lenses focused at slightly different depths.  Software could then easily split apart the single digital image into its component frames and do an easy focus analysis to determine which regions in each were in focus, with viewer software showing those as appropriate in response to touch.  The limitations of this approach would be the increased lighting requirements (or decreased action) as a result of the smaller lenses, the expected poorer quality of each lens (related to cost and to it being more a novelty manufacture than something embraced by the lens giants), and the reduced resolution (as your effective megapixel count would be the original value divided by the number of lenses within the lens).  Many stereo photography setups coordinate two cameras to take their photos in concert, getting around all these issues; you could also do that to solve this problem, though I can imagine nothing more cumbersome.


The Lomo Oktomat, as seen on Lomography, has eight lenses which it uses to capture 2.5 seconds of motion across a single analog film frame divided into 8 regions. The same multi-lens approach, but with all lenses used simultaneously and each focused slightly differently, could capture motion with Lytro-like aesthetics.

Focus Bracketing leads to Focus Stacking (Hyperfocus)

As I began to look into the practicality of these approaches I was pleased to discover that "focus bracketing" was already being done manually, though with an intriguingly different goal.  Rather than produce a living picture where you can focus on different elements in a scene, a process called focus stacking is used to take and then (using software) merge photos taken at different focus settings to produce a single image in which everything in the scene is in focus.  The software involved analyzes each photograph in the stack, each of the identical scene with only the focus varied, and uses the regions of each photo which are in focus to produce the combined image in which everything is in focus.  This approach produces very impressive results.  The only limitations to this system are the requirement for a still scene and the strong recommendation (if not requirement) that you use a tripod when taking your shots so that as little varies as possible.
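
Conceptually the merging step is simple: for each pixel (or small neighborhood), keep the frame in which that neighborhood is sharpest, with sharpness measured by local contrast, for example the magnitude of a Laplacian.  One common simplified rule is

    out(x, y) = I_k*(x, y),   where  k* = argmax_k |∇²I_k(x, y)|

with the selection map smoothed so regions don't jump between frames at their boundaries.  Real focus-stacking software adds frame alignment and more robust sharpness measures, but that's the core of it.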


Series of images demonstrating a 6 image focus bracket of a Tachinid fly. First two images illustrate typical DOF of a single image at f/10 while the third image is the composite of 6 images. From Focus Stacking entry on Wikipedia.

The aesthetic of a photo in which most things are in focus is quite different from one in which only those things you select are in focus, but from a technical standpoint they are quite similar, since both require possessing in-focus data for every element in a scene.  And a viewer could (and likely would) be given the option of viewing such a photo as he or she wished.  Do they want to see the photo traditionally (one, non-interactive focus point), as a "living picture" where they can choose the object in focus, or as a photo in which everything is in focus?

Focus Bracketing Available Today on your Canon PowerShot

A little further research led me to a rather intriguing way to add automatic focus bracketing to an entire range of camera models, via the Canon Hack Development Kit (CHDK).  CHDK allows you to safely and temporarily run a highly configurable and extensible alternative firmware on your Canon PowerShot.  And users have used this to add focus bracketing for the purposes of focus stacking, and have included detailed instructions on just how you can do it, too.

Coming Soon as an iPhone & Android Camera App

This integration of camera and software is a natural fit for an iPhone and Android app, where the app can control the capturing of the image and intelligent variation of focus and then do the simple post-processing to make the image click-focus-able. While I haven't seen such an app, I'm sure it'll come soon. I'd write it myself if I had the time.

Until the Future Comes

The point is that until Lytro demonstrates just what can be done with a light field camera, beyond merely creating a low-resolution "living picture", there's really no technical justification for placing the technology in people's hands when the same problem could be solved as effectively with traditional digital cameras.  If demand exists (and perhaps it will come) for this image experience, no light fields need apply.  Hopefully traditional digital camera companies will see the aesthetic value and include support for intelligent focus bracketing in their firmware and coordinated desktop software, app developers will launch good living-picture-capturing camera apps, and hopefully Lytro will demonstrate the additional merits of capturing and reproducing images from light fields.

^ Quinxy
