NSA Snoops

June 27, 2013

This whole NSA melodrama is rather interesting. Once again a disgruntled government contractor has leaked juicy top secret documents, and the fallout has divided politicians roughly along that quirky libertarian/authoritarian axis. Meanwhile, the public is freaking out because privacy online is an illusion, and Big Brother has been watching all along. George Orwell was right! We didn’t listen!

Along with several other people, I’m not exactly surprised by this revelation. What does the NSA even do if not this? How else does a company like Palantir make money? Are people actually surprised that planning crimes on Facebook is not a solid business plan?

Less obviously, why would anyone expect an ad-supported media company to stand up for privacy rights or whatever the libertarians want? People are acting like this is the cyberspace equivalent of strong-arming banks in order to search everyone’s safety deposit boxes. I’d suggest that it’s closer to reading people’s mail and rooting through their garbage. If someone invented a cheap stealth robot to do exactly that, do you really think the NSA wouldn’t be their top customer?

Signals intelligence is a sketchy business, whether or not you agree with the tactics. Most contractors who are not fleeing the country probably figured that out a long time ago.

Hobgoblins

March 16, 2013

Here’s a quote from H.L. Mencken that caught my attention:

The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.

I guess I’ve always been a bit amused by hobgoblins, having seen quite a lot of them influence American pop culture firsthand. This isn’t to say that none of these scare stories have any basis in reality, or that I have always seen through the hype. However, it will be fun to maintain an official List of Hobgoblins that this millennial remembers. Here it is:

Nuclear Winter, Pollution, Japanese Manufacturing, Ozone Depletion, Saddam Hussein, Ecological Collapse, Skynet, Global Warming, Y2K, Terrorists, SARS, Russian Hackers, Nazi George Bush, Mexicans, Illuminati Bankers, Genetically Modified Food, Hyperinflation, Climate Change, Nazi Barack Obama, Ocean Acidification, Chinese Hackers, Nuclear Meltdowns, Bath Salts, Climate Disruption, Planet X, Budget Cuts…

The list approximates a chronology, and it will be updated.

The Pixar Princess

February 14, 2013

A surprising number of animated classics have been created during my lifetime. Two important revolutions happened in America at the end of the 20th century: the “Disney Renaissance” and the arrival of computer-based animation. While the explosive impact of 3D was hard to miss, another important factor was subtle and sociopolitical.

The Little Mermaid was a regression in one sense, to the animated Broadway-musical royal romances that defined early Disney animation. In another sense, it broke new ground for the format. Ariel is the titular 19th-century ocean-dwelling heroine who makes a Faustian deal to win the heart of Prince Eric, a boring but beautiful human. As only the fourth official Disney Princess, she does something other than be in distress, which is notable. The box office returns were impressive, and the royal romance was back in a big way. However, that whole singing princess trope hasn’t transcended its gendered appeal to this day. Consider that Pixar out-earned Disney at the box office without featuring a princess for 17 years. Or consider Halloween costume sales.

Anyway, let’s continue. Belle is also an “interesting princess” with all the book learning and such, but her character has more depth in that she learns to love a hideous creature. There’s not much else that needs to be said here, but imagine if the genders were switched!

Jasmine is the bride of Aladdin. She is notable for being non-European and owning a tiger. Perhaps because boys would balk at another Beauty and the Beast, the action in her movie revolves around its illegitimate title prince. Simba, the Lion King, doesn’t actually marry an official Disney Princess, which is just as well because lions are polygynous cannibals.

Pocahontas is next, featuring a quasi-historical (Native) American. The wheels were coming off the royal romance gravy train by this time, and her movie was slightly overshadowed by Pixar’s explosive debut. Animation would be changed forever; 2D was suddenly old-fashioned and unmarketable (see The Iron Giant). While its shiny new rendering process got a lot of attention, Toy Story was also at the vanguard of a different narrative technique. Gone were the musical numbers and pretty princesses – the only romance in Toy Story is between Woody and Bo Peep, and the framing device literally casts them as role-playing toys.

That stroke of genius allowed the franchise to explore mature themes like jealousy, existential angst, and the social contract, while basing all the action around a child and his possessions. Perhaps there is some significance to Andy’s gender and the fact that his pretend play always involves aggressive conflict between good and evil. The neighbor Sid takes this to a perverted extreme, obliterating and mutilating toys, while his sister Hannah has them share idle gossip over tea.

In any case, Pixar’s movies have avoided the royal romance trope almost entirely. DreamWorks’ Shrek, by contrast, absolutely wallowed in fairy-tale nonsense, and eventually The Princess and the Frog and Tangled introduced Tiana and Rapunzel as the first Black and 3D Disney Princesses, respectively. Meanwhile, Finding Nemo celebrated guardianship and adventure, The Incredibles focused on ability and the family unit, and Ratatouille studied ambition and creative expression. The latter did have a romantic subplot and a peerage aspect which was subversive at best.

To get to the point, Merida the Brave is scheduled to become the first official Pixar Princess in July. This is interesting for several reasons: First, she stays single until the end of the movie! Her three would-be suitors are not very charming, to say the least. Second, she doesn’t sing, except for one Gaelic lullaby. Finally, Merida isn’t actually the first romantic female lead in a Pixar movie. That honor goes to EVE, a machine with a fusion reactor and a minimalist design supervised by Jony Ive. Clever.

I’m leaving out several other animated features of note, like Wallace and Gromit, Persepolis, and Coraline, and that isn’t even mentioning Miyazaki or the rest of Japan. Here’s to all these great artists, and congratulations to Princess Merida!

The Touchscreen Paradigm

January 21, 2013

Programming is evolving faster than ever. In recent years, mobile platforms have broken the software market wide open, and most implications of this disruption are yet to be discovered. However, some effects are already obvious. Software has transcended the limitations of mouse/keyboard/gamepad input, since mobile devices integrate touchscreens with cameras, microphones, speakers, and wireless connections. I call this “the touchscreen paradigm” but it refers to all of those now-standard inputs and outputs.

This hardware generalizes to an unprecedented number of applications. A typing keyboard can be simulated by key images on the touchscreen, although that experience has decidedly inferior ergonomics. Mouse clicks are replaced by touchscreen taps, and while this system has no provision for “hover” interactions, other forms of mouse control are improved. Drawing is very awkward with a mouse, since the brain has to map the mousepad surface to the display in real-time. Touchscreens eliminate this problem, and in fact they are functionally similar to high-end drawing tablets with integrated screens that have been available for some time. Wacom, a manufacturer of these computer accessories, now sells a high-end stylus that integrates with mobile software.

Other applications go beyond anything that is possible with a mouse and keyboard. Multiple finger touches can be processed at once, making “Minority Report” interfaces easy to build in software. Microsoft put significant capital into a tabletop touchscreen computer called the Surface Table (re-branded as the PixelSense). However, similar interfaces can be added to mobile devices with software, such as this Photo Table application. Fortunately for independent developers, the barriers to entry in mobile software are very low because standard hardware already exists.
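
For readers curious about how little code a multi-touch interface actually requires, here is a minimal browser-side sketch in TypeScript using the standard DOM TouchEvent API. The canvas element and the drawDot helper are assumptions for illustration, not a reference to any product mentioned above.

    // A minimal multi-touch sketch using the standard DOM TouchEvent API.
    // Assumptions: the page contains a <canvas id="surface"> element;
    // drawDot is a hypothetical helper defined here for illustration.
    const surface = document.getElementById("surface") as HTMLCanvasElement;
    const ctx = surface.getContext("2d")!;

    // Draw a small dot wherever a finger currently rests.
    function drawDot(x: number, y: number): void {
      ctx.beginPath();
      ctx.arc(x, y, 12, 0, 2 * Math.PI);
      ctx.fill();
    }

    function handleTouches(event: TouchEvent): void {
      event.preventDefault(); // keep the browser from scrolling or zooming
      ctx.clearRect(0, 0, surface.width, surface.height);
      // event.touches lists every finger on the screen, not just the one that moved.
      for (let i = 0; i < event.touches.length; i++) {
        const touch = event.touches[i];
        drawDot(touch.clientX, touch.clientY);
      }
    }

    surface.addEventListener("touchstart", handleTouches, { passive: false });
    surface.addEventListener("touchmove", handleTouches, { passive: false });

The key detail is that event.touches enumerates every finger currently on the screen, so tracking several simultaneous contacts is just a loop.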

These examples barely begin to fill the space of possible touchscreen applications. My phone is already a feature-rich camera, an “FM” radio, a guitar tuner, an SSH client, and a flashlight. Those products have been manufactured before with dedicated hardware, but mobile software is also being used to invent completely new technology. Products which require a touchscreen, audio/video input/output, or an internet connection can be built entirely in software and sold as mobile applications.

As a software engineer, I see this as an obviously good thing. However, the sheer number of new applications that are possible on mobile platforms presents an intimidating problem: what to build next? Customers might be able to describe what they’d pay for, but they don’t always know what they’ll want to buy in the future. The first generation of application programmers probably experienced a similar feeling. It’s inspiring and terrifying at the same time.

Non-Player Characters

December 24, 2012

Here’s an interesting idea. This article mentions “non-player characters” in the context of a role-playing game, and proposes something rather unsettling:

Many of us approach the other people in our lives as NPCs.

I’ve been thinking along similar lines. People often imagine strangers as incidental scenery, part of the social environment. This is understandable given the limits of human knowledge – there simply isn’t enough time or mental capacity to understand very much about very many people. However, we often forget that this perspective is only a necessary convenience that allows us to function as individuals. For example, if you’ve ever been in a rush to get somewhere on public transportation, you’ve probably felt that bit of guilty disappointment while waiting to accommodate a wheelchair-bound passenger. Here comes some person into my environment to take another minute of my time, right? If you use a wheelchair yourself, this delay happens every time you catch a ride, and that frustration simply does not exist. If anything, I would imagine that disabled passengers feel self-conscious every time they are in a situation where their disability affects other people’s lives, even in an insignificant way.

Has this always been true? Probably to some degree, but the modern media environment seems to especially promote it. Good fiction writing communicates the thoughts and motivations of relevant characters, unless they are complete unknowns. This means that any meaningfully observable character has some kind of hypothesized history and experience informing their participation in the story. Film is different, in that a script can describe an “evening crowd” in two words, but the realization of that idea can involve hundreds of extras, living entire lives and working day jobs that barely relate to their final appearance on the screen. We can assume that their real lives intersected with the production of that scene on that day, but it’s really the only significance that their identities have in context.

With interactive media, the idea of a “non-player character” has appeared in many forms, and academics study how they can design the best (read: most believable) fictional characters for interactive environments. Here the limited reality of these characters is even more pronounced. In video games, non-player characters have lower polygon counts, fewer animations, and generally use less code and data. This is a consequence of the limited resources available for building a virtual environment, but the effect is readily apparent and forced.

Does this mean video games shouldn’t include background characters? Not really. What I’m suggesting is that we should be careful to see this phenomenon for what it is: an information bias in favor of the protagonist, which necessarily happens while producing media. It shouldn’t ever be mistaken for a relevant characteristic of the real world. This holiday season, when you’re waiting an extra minute or two for a disabled stranger, or expecting better service from a tired professional, remember that he or she probably has lived a life as rich and complicated as your own, and try not to react as if he or she is just some kind of annoying scenery. Whoever it is might return the favor, even if you never realize it.

Global Whining

November 28, 2012

The scientific method is the greatest invention of the modern age. For centuries, its practitioners have transformed civilization using rational systems revealed through careful observation. Theories which have succeeded not by virtue of their popularity but because they correctly predicted unknown phenomena are especially awe-inspiring. However, predictions without rational justification or those vague enough to be confirmed by any number of observations should not earn the same recognition. I’m a huge fan of Karl Popper regarding falsification, the idea that a scientific theory must predict some observable event which would prove it wrong. This principle eliminates uncertainty regarding how specific a valid theory must be. Unfortunately, it has been ignored by some academics who claim to be scientists so that people won’t laugh at their ideas. You might have already guessed that today I’m targeting the low-hanging fruit of global warming alarmism. Prepare to be offended.

I won’t waste your attention picking apart the various temperature series, criticizing the IPCC models, or citing evidence of misconduct, not because those arguments have already been made by more qualified individuals, but because they shouldn’t even be necessary. Fundamental problems with any apocalyptic hypothesis make the whole enterprise seem ridiculous. This is what Popper says about scientific theory:

1) It is easy to obtain confirmations, or verifications, for nearly every theory – if we look for confirmations.
2) Confirmations should count only if they are the result of risky predictions; that is to say, if, unenlightened by the theory in question, we should have expected […] an event which would have refuted the theory.
3) Every “good” scientific theory is a prohibition: it forbids certain things to happen. The more a theory forbids, the better it is.
4) A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice.
5) Every genuine test of a theory is an attempt to falsify it, or to refute it. Testability is falsifiability; but there are degrees of testability: some theories are more testable, more exposed to refutation, than others; they take, as it were, greater risks.

The scenarios published by climate modelers don’t qualify as scientific predictions because there is no way to falsify them – updated temperature measurements will inevitably correlate with some projections better than others. And fitting curves to historical data isn’t a valid method for predicting the future. Will the IPCC declare the CO2-H2O-feedback warming model invalid and disband if the trend in the last decade of HadCRUT3 data continues for another decade or two? How about if the Arctic ice cap survives the Summer of 2014? I’m supposed to trust these academics and politicians with billions of public dollars, before their vague predictions can be tested, because the global warming apocalypse they describe sounds more expensive? This riotous laughter isn’t meant to be insulting; we all say stupid things now and then.

Doesn’t the arrival of ScaryStorm Sandy confirm our worst environmental fears? Not if we’re still talking about Karl Popper’s science. Enlightened by the theory of Catastrophic Anthropogenic Global Warming, academics were reluctant to blame mankind for “exceptional events” only two months ago. They probably didn’t expect a hurricane to go sub-tropical and converge with a cold front as it hit New York Bight on the full moon at high tide less than six weeks later, because that kind of thing doesn’t happen very often. Informed news readers might have been expecting some coy suggestion that global warming “influences” weather systems in the rush to capitalize on this disaster. But in a caricature of sensationalism, Bloomberg splashes “IT’S GLOBAL WARMING, STUPID” across a bright orange magazine cover, and suddenly enormous storm surges threaten our infrastructure again while the seas are still rising slowly and inevitably, all because of those dirty fossil fuels.

I don’t mean to say that we should actually expect scientific integrity from a stockbrokers’ tabloid, but Mike Bloomberg has really sunk to a new low. He spoke at TechCrunch Disrupt a few years ago and seemed like an average business-friendly mayor, not a shameless propagandist. I guess the soda ban was a bad omen. It’s a bit discouraging to see another newspaper endorse the panic, but then the organizers of our climate crusade have been pushing their statist agenda on broadcasters for a long time.

On Sunday, the New York Times doubled down with this ridiculous melodramatic lament, written by one talented liberal artist. Where’s a prediction about the next exceptional event, folks? Is it going to be a tornado or earthquake, or does it matter? Are there actually any rules for preaching obnoxious hindsight to believers? Can anyone suggest an observation that would falsify the theory?

What will the temperature anomaly or the concentration of carbon dioxide measure in ten years? How about one solid date for the eradication of a low-lying coastal city? If you must predict the apocalypse, it is only somewhat scientific if you can rationally argue for a deadline. And the science is only settled when the world ends (or doesn’t end) on time.

Plus ça change, plus c’est la même chose – the more things change, the more they stay the same. Happy 2012.

Singing

October 9, 2012

If you like music at all, take some advice from me and sing whenever you have a chance. It’s good for your soul, whatever that means. I’ve played music for many years, but I recently discovered how much singing affects my mental health. Live shows don’t count, because the crowd and the amplifiers drown you out. Sing when you’re alone in a quiet room, and fill it with your voice. Sing whenever you’re driving alone. Sing at work, as long as you aren’t disturbing anyone who can punish you. Just sing – you can probably play your own vocal cords even if you can’t play another instrument. If you sound awful, don’t worry about it. Nobody is going to harass you for singing poorly in private. Many people are too insecure to try it at all.

I’m saying this because something wonderful happens when we hear ourselves sing. It helps the brain. Matching pitch forces us to listen to the sounds that we produce, and quite possibly to produce sounds that we are not entirely comfortable hearing. This is a good thing! It teaches us to be more comfortable whenever we use our voices, and more confident in general. The sooner you get accustomed to hearing yourself make unfamiliar noises, the sooner you will be able to make noise in public with absolute confidence. Sometimes, that is absolutely necessary. Why not be prepared?

The Value of Education

September 18, 2012

Last week I wrote an aggressive piece criticizing non-technical education. I’m not backing down from that stance, but as my sister is currently studying for her MBA, I’d like to offer some reassurance to anyone who might have been intimidated by those words. I wasn’t trying to say that the Business School isn’t valuable or necessary. The piece was intended as a warning to anyone with misplaced confidence in accreditation. I’ve seen far too many graduates with egos disproportionate to their actual value in modern society. The world is changing very rapidly, and anyone clinging to outdated definitions of achievement and success will not make it very far in the future.

My advice is radical: Disregard the metrics that your teachers provide, and come up with your own as soon as possible. If you’re dutifully completing your schoolwork to earn a degree that will make someone respect you or be proud of you, don’t abandon that. However, don’t let yourself think that those credentials will matter to everyone you meet in the professional world. Nobody seems to care where I went to school anymore. Having a degree got me to this point, so I can’t disregard its importance, but at the same time my degree doesn’t guarantee me very much money, power, or prestige.

From Latin, education is literally the process by which a teacher “leads out” knowledge from a student. We haven’t figured out a better way to motivate young children, but there comes a point in every student’s life where knowledge must be willfully sought. Unless you’re enrolled in a trade school, that point now arrives before the end of college. Your professors are trying to help you, but they can only grant you legal accreditation, which is necessarily someone else’s definition of success. That’s not enough anymore. Even if you’re going for your PhD, it won’t bring you a good job that doesn’t require advanced mathematics. The harsh reality is that you will almost certainly have to work towards excellence on your own. Start now, before it’s too late.

The Programmer and The MBA

September 11, 2012

I’d like to write about higher education today. My BS is in Multidisciplinary Studies, a custom blend of classes that I found interesting enough to finish. I received it from RIT after failing out of one college for lack of motivation. In fact, I had earned a full ride at Fordham University, and had selfishly ignored my homework to spend quality time in the greatest city on Earth.

I’m lucky enough to have studied CS at Fordham, and while I was never officially enrolled in a computer class at RIT, the name is associated with technical excellence, so my degree ended up being very impressive on paper. I don’t mean to say that I can’t program computers – I’m an extremely gifted engineer and I know Xcode as well as the next startup founder. However, my course of study included media and entrepreneurship classes, so I have a unique perspective on the Business School vs. Engineering School argument currently raging across this country.

As you might have guessed, I’m on the side of the engineers. I’ve always been a bit of a nerd, so of course I have sympathy for other introverts, but that is not my reason for joining this argument right now. I’m blogging today because computer nerds know how to make software, they tend to get exploited by people who don’t, and it is ruining America for everyone. Our economy is completely dependent on computers at this point, and I am astonished that so many of our “leaders” still haven’t figured out how to use Microsoft Outlook.

The members of the Business School claim that they’re learning how to work with other people and building skills that are needed by large companies. I’m skeptical about this. A company won’t grow large enough for these skills to matter until it has built something profitable. An MBA assumes that large companies already exist and demand specialized management skills. It is only useful as long as those assumptions are true.

Unfortunately, another problem with the MBA is not affected by economic conditions. Computers follow instructions, but humans do not. A student can’t really learn how to manage people by listening to another person. The student must discover those skills by herself. Don’t forget that, especially if you’re training to be a manager in the Ivy League. Every empire ends.

Interestingly, and almost right on cue, the everything-should-be-free police have descended on Dalton Caldwell for following through with something as original and outside-the-box as his audacious proposal: the supremely interesting app.net alpha community (and its associated API). These doubters mostly seem to be against technology business experiments on principle, and I’ve decided that they simply do not understand its significance yet.

If I may hazard a guess, what these late adopters aren’t grasping is that right now, app.net is a lot more expensive than the sticker price suggests. Even the cheaper options are closer to $1,000 in real cost for the dominant majority of its users. Do you know why? Most of them seem to be busy hustling for some kind of profit they can live off of, and even participating in this network, to say nothing of developing for it, is an enormous time investment! We’re putting our money down to give Dalton food and motivation, because he’s convinced us that 50 bucks a year no longer matters for a network with this much potential. I’d pay twice as much every year just for the publishing functionality, regardless of how many users stick around, and especially if they maintain this superb commitment to the product. Think about what you are actually using your money for, people! This is starting to look ridiculous – I think it’s rather undeniable that Dalton has touched one heck of a nerve here!

There is only one way that the app.net community can ever shake off the rather myopic “elitist Twitter” label that naysayers seem to be gravitating towards, and start something that I think a lot of people around the world will want: prove to them what a network of interested parties can do. To that end, I’m working on a new Chrome extension called AppAnnotate that uses the app.net API to let people annotate any web page and share notes with friends. The way of the future!
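
Since AppAnnotate is still a work in progress, here is only a rough sketch, in TypeScript, of how a Chrome content script might capture a selection on a page and share it as a note. The app.net posts endpoint, the token handling, and the post-on-selection flow are illustrative assumptions rather than the extension’s actual design, so check the official API documentation before relying on any of it.

    // Rough sketch of an AppAnnotate-style content script (illustrative assumptions only).
    // It captures the user's text selection and posts it as a short app.net note.
    // The endpoint URL and ACCESS_TOKEN placeholder are assumptions; consult the API docs.
    const API_URL = "https://alpha-api.app.net/stream/0/posts"; // assumed posts endpoint
    const ACCESS_TOKEN = "YOUR_TOKEN_HERE";                     // placeholder credential

    async function shareAnnotation(note: string): Promise<void> {
      await fetch(API_URL, {
        method: "POST",
        headers: {
          "Authorization": `Bearer ${ACCESS_TOKEN}`,
          "Content-Type": "application/json",
        },
        // Include the page URL so friends can see what the note refers to.
        body: JSON.stringify({ text: `${note} (${window.location.href})` }),
      });
    }

    document.addEventListener("mouseup", () => {
      const selection = window.getSelection()?.toString().trim();
      if (selection) {
        // A real extension would open an annotation UI here instead of posting immediately.
        void shareAnnotation(selection);
      }
    });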