- Invoke Alfred
- Start typing wolf... to bring up the Wolfram workflow
- Paste LaTeX code and return
- Home: My home network, including Wi-Fi and ethernet connections
- Public Wi-Fi: Any open or public network I join
- LTE: Any network created by my iPhone or iPad’s hotspot feature
To be clear, this is not a criticism of Backblaze. I HIGHLY recommend using Backblaze for bulk online backup. We just live at a time when mobile data isn’t yet a fully capable peer (cost-wise) of non-mobile networks. ↩
To a lesser extent, Dropbox can also be a data hog, so I usually block Dropbox by default and allow it on an exception basis. Most of my Dropbox work involves text files, so the bandwidth is pretty low. However, I’ve taken data hits in the past when someone else added a large amount of data to a shared folder and Dropbox synced the data while I was connected to LTE. With Little Snitch always watching, there’s no need to worry about these unforeseen data traffic bursts. ↩
On Scene: The Journal of U.S. Coast Guard Search and Rescue (Fall 2006) ↩
I don’t mean to suggest that Shakespeare literally anticipated the notion of “instinctive drowning response” several hundred years before Frank Pia, but the symbolism of that age is too striking not to pass on. ↩
I feel compelled to mention that I first encountered the notion of “psychological time” in Eckhart Tolle’s book The Power of Now, which I half-heartedly recommend reading. “Half-heartedly” because it is very spiritual in nature, and I’m no evangelist. Tolle’s dichotomy is useful as a mental construct, though. His premise is that psychological time is entirely an illusion, consisting mostly of ego-driven, unpleasant thoughts about the future, whereas clock time is real and objective. Being a sucker for a physics analogy, I couldn’t resist taking it a little further here. Note that “clock” time as I used it here is technically proper time under Special Relativity. I’ll leave it to the reader to decide how best to view time—illusory or not. ↩
- The intangibles – speed, fluidity, and a well-designed interface with large, easy-to-read fonts
- Search scope control – By default, Alfred only triggers apps, workflows, and other built-in actions. To search for files and folders, you can either type find or press space. I love this "front-end fork" in the path I can travel to those two broad classes of objects.
- Workflows – Alfred workflows can only be explained by example and trying them yourself. Just a few that I use all the time:
- Common web service searches for Google, YouTube, Flickr, Amazon and many more.
- Evernote – Please just install this one if you use Alfred and Evernote. I especially love using ent to search by filename.
- Paste as text – a simple workflow that does what it says in any application
- Custom search filters – For example, if I type sp, I'm doing a keyword search only over my Sublime Text 2 projects
- 1Password integration – go straight to login sites from Alfred
- System commands – eject volumes, restart, and more
- Basic calculations – Alfred is my primary calculator because it's always just a Cmd-Space away
- Command history – Simply arrow up to view previous things you've entered in Alfred. It's great for returning to previous file searches and calculation history.
Rest begets
Rest begets awareness.
Awareness begets prioritization.
Prioritization begets caring.
Caring begets achievement.
Adding a little more awesome to Alfred and Alpha
In case you missed it, Dr. Drang wrote a nice response to my post on evaluating LaTeX with Alfred and Wolfram Alpha. Though he doesn't use Alfred, the Professor of Python does write a lot of LaTeX, and it would shock no one who knows of him to learn that he quickly cooked up some .py to query Wolfram Alpha from BBEdit.
For whatever reason, today I felt the urge to make a good thing better, and it occurred to me that someone out there on the internet had probably made a Wolfram workflow with more bells and whistles than the stock Wolfram workflow that ships with Alfred.
Sure enough, a little web search turned up a post from (of course) my good friend Gabe at Macdrifter, who pointed me to just such a workflow written by David Ferguson.
David's workflow returns Wolfram results live, right in the Alfred interface, without having to go to the Wolfram site at all.
In early testing, it works flawlessly with standard LaTeX input. For example, here's the 1.9602 I was trying to get in the last post:
<img src="/img/img.png" alt=""/>
Pressing ⌥↩ even puts the numerical result on my clipboard. So yes, this is a way to evaluate LaTeX expressions right in Alfred without ever losing sight of your LaTeX code.
You do need a Wolfram app ID for this to work, but you can get one free here.
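If you're curious what a workflow like this boils down to, here's a minimal sketch of my own (not David's actual code). It assumes Wolfram Alpha's Short Answers API and a placeholder app ID:

```python
# Minimal sketch: send a raw LaTeX expression to Wolfram Alpha and print the answer.
# Assumes the Short Answers API endpoint and a placeholder app ID.
import urllib.parse
import urllib.request

APP_ID = "YOUR-APP-ID"  # placeholder; swap in your own free Wolfram app ID


def ask_wolfram(expression):
    """Return Wolfram Alpha's short, plain-text answer for the expression."""
    params = urllib.parse.urlencode({"appid": APP_ID, "i": expression})
    url = "https://api.wolframalpha.com/v1/result?" + params
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8")


if __name__ == "__main__":
    latex = r"\frac{0.9989 + \frac{0.9975}{1.04}}{0.9989}"
    print(ask_wolfram(latex))  # hoping to see something close to 1.9602
```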
Evaluating LaTeX code with Alfred and Wolfram Alpha
The LaTeX documents I create have a lot of numerical illustrations. If you write hundreds (maybe thousands) of numerical expressions in LaTeX over the course of a year, you will fat-finger a few digits along the way, and I really hate typos.
For example, my PDF output might have an expression like this:
<img src="/img/math-pe.png" alt=""/>
Quick. In your head... does that really equal 1.9602? Yeah, I don't want to do that either.
My old fashioned editing approach would be to hand check each one with a calculator like Soulver, typing in the expression after it's typeset to PDF and verifying the result. This is because the LaTeX code itself isn't in a format that can be pasted into a typical calculator:
\frac{0.9989 + \frac{0.9975}{1.04}}{0.9989} = 1.9602
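Of course, for any single expression you could transcribe it into plain arithmetic by hand and check it that way, which is exactly the kind of tedium I want to avoid:

```python
# Hand-transcribed check of the expression above (the tedious way)
value = (0.9989 + 0.9975 / 1.04) / 0.9989
print(round(value, 4))  # 1.9602
```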
A better way
I only recently found out that Wolfram Alpha can evaluate most LaTeX expressions in their raw form. Just paste a LaTeX expression into Wolfram Alpha's text input field, and see what it does.
For an even faster approach—avoiding the need to even leave your favorite LaTeX editor—use the stock "Ask Wolfram" workflow in Alfred:
<img src="/img/img.png" alt=""/>
The workflow takes you straight to the Wolfram site, which evaluates the expression in a second or two.
<img src="/img/img.png" alt=""/>
That's probably enough digits, Wolfram. But anyway, yeah, there's the 1.9602 I was hoping to see.
This is by far the fastest and most accurate way I've found to audit LaTeX numerical expressions on the fly. In fact, I've started using Wolfram Alpha to evaluate expressions in the first place. This lets me type LaTeX expressions as I "think" them and know that I can evaluate them in-line as I go.
Alfred, after all, is always just a Cmd-Space away.
Update: For ideas on automating this process further with Python, see Dr. Drang's follow-up post.
Update 2: Here's an even faster way to evaluate LaTeX with Alfred and Wolfram.
Using Little Snitch to lower your LTE bill
Your Mac is a tiny but vociferous habitat amid an even more boisterous biome. The internet is a really loud place.
If you could somehow translate network traffic to an audible frequency, I’m sure it would sound like a rainforest of monkeys, tropical birds, frogs, bugs, and God knows what other creatures screaming out incessantly in a programmed ritual of information mating.
But how loud should your Mac be? Who should it be allowed to cavort with?
Enter Little Snitch—a Mac application that alerts you any time any application on your Mac attempts to connect to the internet. You’re able to allow or deny connections on a permanent or temporary basis. Little Snitch groups these “rules” into profiles that can be network-specific or global. Best of all, as you join new networks, Little Snitch lets you assign them to profiles.
I use three Little Snitch profiles to muzzle the monkeys:
My Home Profile
On my home network profile, anything goes. I don’t have any data caps or security concerns at home, so I generally cut things loose. If home were the only twig in the internet rainforest I sat on, I probably wouldn’t need Little Snitch at all—though I do like how the menubar icon shows me if something is doing a lot of uploading or downloading.
My Public Wi-Fi Profile
If my home network is a tranquil pond of koi, public Wi-Fi is a muddy swamp full of piranha and pythons—with panthers patrolling the perimeter.
My Public Wi-Fi profile is much more restrictive. I’ve locked down just about everything except essential services, web browsing, and email. If I must connect to a public network, I want as little information flowing in and out of my Mac as possible.
My LTE Profile
When connected to LTE, the concern isn’t privacy predation. It’s data usage. And boy, does Little Snitch really help here.
Before I started using Little Snitch a few months ago, I was routinely running right up against my Verizon Wireless data limit around the 23rd day of each month’s billing cycle. My options were 1) impose a moratorium on LTE usage the last week of the month, 2) go over my limit and incur an overage charge, or 3) increase my data limit.
I stubbornly never chose option (3), meaning that every month I either had to give up the benefits of LTE or give up more money to keep using LTE.
I knew that Backblaze, my preferred online backup service, was part of the problem. Backblaze currently offers no way of restricting backups by network.1 And my Mac currently offers no way to change the behavior of applications on a network-by-network basis. As far as my Mac is concerned, a Wi-Fi network fed by LTE data is the same as any other Wi-Fi network.
To reduce the bleeding, I had to remember to manually pause Backblaze when connecting by LTE, and I frequently did not think to do that until it was too late.2
Now, Little Snitch essentially does the pausing for me. It’s as simple as permanently blocking bztransmit the first time it tries to connect over LTE.
<img src="/img/img.png" alt=""/>
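If you don’t use Little Snitch, a cruder do-it-yourself version of this check is possible. Here’s a minimal sketch (mine, not how Little Snitch or Backblaze actually work) that asks macOS for the current Wi-Fi network name with networksetup and simply warns before letting heavy uploads run; the hotspot name is a hypothetical placeholder.

```python
# Rough DIY sketch (not how Little Snitch works): ask macOS which Wi-Fi network
# the Mac is on and warn before kicking off bandwidth-heavy jobs on a hotspot.
# "My iPhone" is a hypothetical hotspot name; en0 is the usual Wi-Fi interface.
import subprocess

HOTSPOT_SSIDS = {"My iPhone"}


def current_ssid(interface="en0"):
    out = subprocess.run(
        ["networksetup", "-getairportnetwork", interface],
        capture_output=True, text=True,
    ).stdout.strip()
    # Output looks like "Current Wi-Fi Network: SomeNetworkName"
    return out.split(": ", 1)[1] if ": " in out else None


if current_ssid() in HOTSPOT_SSIDS:
    print("On an LTE hotspot: hold the big uploads.")
else:
    print("On a regular network: let everything run.")
```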
As soon as I began using Little Snitch this way, my LTE data usage issues went away completely. In fact, I barely climb above 2.5 GB of LTE data usage in any given month. Before, I would end up anywhere from 4.0 to 4.5 GB, putting me at or over my 4 GB plan.
Little Snitch lets me use just enough LTE on my Mac to be productive—mostly low-bandwidth web browsing and email.
My Cash Flow Profile
The economics here are what one might call intuitive. Little Snitch costs $35 once, and it saves me $10-15 every month. If you regularly hop across different networks, each of which poses its own security and data usage challenges, I highly recommend trying out Little Snitch.
On productivity, self-perception, measurement, and ethic
I believe that for changes to take long-lasting effect, they have to influence your identity, your core picture of yourself.
I love the grit and honesty in the whole thing.
The dead line
Ophelia by John Everett Millais (1852)
Once you begin drowning, your mind stops being yours in a human sense. The drowning mind shifts into a very basic mode—a state that prioritizes primitive survival protocols over any kind of higher-level reasoning.
In these final moments, your mind returns to a relatively simple state. So simple, in fact, that you probably won’t even appear to be drowning. “Instinctive drowning response,” the mind’s primal reaction to the realization of drowning, was surfaced by Frank Pia, Ph.D., in the early 1970s.
“Except in rare circumstances,” Pia explains, “drowning people are physiologically unable to call out for help. The respiratory system was designed for breathing. Speech is the secondary, or overlaid, function. Breathing must be fulfilled, before speech occurs.”[1]
During instinctive drowning response, you won’t even be able to use your arms to signal for help. Instead, your arms will flap up and down in a surprisingly calm and inconspicuous manner. To an untrained eye, you’ll appear quite normal—just like those around you, or perhaps just like someone calmly treading water.
This process can last up to 60 seconds. After that, you slip beneath the surface.
But even then, your body won’t give up. As your vocal cords detect incoming water, they spasm involuntarily—a condition known as laryngospasm. In a last-ditch effort to save your lungs, your vocal cords will constrict, sealing your airway—diverting most of the water into your stomach.
If you’re lucky enough to be pulled to the surface and resuscitated within a few minutes, there’s a good chance you’ll be fine, especially if the laryngospasm kept your lungs dry. But absent help, you’ll quickly become unconscious (if you haven’t already). Brain function stops after a few minutes of oxygen deprivation.
You die.
* * *
I’ve never drowned myself. All I know is what I’ve read—and imagined. But the more I think about it, the more I think that the final moments of a drowning victim’s life aren’t his worst.
I’m convinced the worst part of drowning is right now—the fear of it… the conscious mind’s simulation of the process of drowning. This uninformed, and hence fearful, mind imagines a frantic fight to stay above an infinitesimal line at which the bottom-most layer of our atmosphere sits on the top-most layer of water—that dividing line between the future, rising up infinitely high overhead, and the deepest, darkest depths below.
These are not pleasant thoughts, and so it seems logical to drown them out of our consciousness into some other place—out of sight. But alas, not out of mind. We carry the fear of death with us everywhere we go.
This is a fairly new development in human history.
In Western Attitudes toward Death: From the Middle Ages to the Present, Philippe Ariès explains that in medieval times, “death was both familiar and near, evoking no great fear or awe, [which] offers too marked a contrast to ours, where death is so frightful that we dare not utter its name.”
Symbolically, when Ophelia drowns in Hamlet (1603),[2]
… she chanted snatches of old tunes;
As one incapable of her own distress,
Or like a creature native and indued
Unto that element…
Death was a medium into which people passed frictionlessly and almost instinctively. This attitude persisted well into the 17th century. According to Ariès, “The spectacle of the dead, whose bones were always being brought up to the surface of the cemeteries, as was the skull in Hamlet, made no more impression upon the living than did the idea of their own death. They were as familiar with the dead as they were familiarized with the idea of their own death.”
You might even say that cemeteries were the original suburbs of older cities. When I had a chance to visit Paris’s largest cemetery, Père Lachaise, that’s what it felt like to me—walking around the streets of a large suburb of the dead.
But clearly, for most of human existence, death was something that virtually every man and woman had become accustomed to seeing since the dawn of their earliest memories. Death was a very accepted aspect of life because it was everywhere among the living.
It wasn’t until very recently in human existence that our final moments became associated with alien-spaceship-like hospital rooms far removed from nature—filled with tubes, blinking lights, and the jarring sounds of machines. Before the 20th century, people rarely even left their homes to die. Death was something that happened quickly, and often it occurred in natural surroundings.
“In the 20th century,” Geoffrey Gorer writes in The Pornography of Death, “there seems to have been an unremarked shift in prudery; whereas copulation has become more and more ‘mentionable,’ particularly in the Anglo-Saxon societies, death has become more and more ‘unmentionable’ as a natural process.” In other words, death and sex traded places in the psyche of the western world.
Gorer wrote that in 1955, but the last sixty years have only punctuated his observation.
People live longer today than ever before—a lot longer. In medieval times a newborn would be expected to barely reach her 30s. Life expectancy crept up slightly in the centuries that followed, but in the 20th century, it soared into the 70s.
Because of longer lifetimes, it would appear that we’re now as distant from death as we’ve ever been—and we are—but only in the most objective sense of time.
By deferring death considerably, we set our thinking minds free to roam unimpeded on significantly longer time scales. But instead of finding more blissful lives, devoid of constant death, we wandered into an unprecedentedly morbid psychological time-scape. Unlike virtually every other living creature, “this is… our curse,” as Stephen Cave says:
It’s the price we pay for being so damn clever. We have to live in the knowledge that the worst thing that can possibly happen one day surely will, the end of all our projects, our hopes, our dreams, of our individual world. We each live in the shadow of a personal apocalypse.
Slowly, but very surely, we traded death for the fear of death.
* * *
Putting aside religious views, once we die, we can no longer produce things on earth. Death is the ultimate deadline-in-common. We’ll all face it, if at singular and unknown times.
The very word deadline arose in the context of death—and a rather despicable one at that. One of its earliest mentions occurred in the trial of Henry Wirz, a Confederate officer who was executed for war crimes committed at Camp Sumter such as these:
… Wirz, still wickedly pursuing his evil purpose, did establish and cause to be designated within the prison enclosure containing said prisoners a “dead line,” being a line around the inner face of the stockade or wall enclosing said prison and about twenty feet distant from and within said stockade; and so established said dead line, which was in many places an imaginary line, in many other places marked by insecure and shifting strips of [boards nailed] upon the tops of small and insecure stakes or posts,… Wirz instructed the prison guard… to fire upon and kill any of the prisoners aforesaid who might touch, fall upon, pass over or under [or] across the said “dead line”…
If death is the ultimate deadline, then every imagined deadline between birth and death is like a miniature moment of mortality in our lives. A deadline might represent the death of a project, for example, or perhaps an unpleasant but necessary task. And just like death, the pleasant feelings obtained by deferring a deadline are ultimately overshadowed by the lingering worry about meeting the deadline again.
On the most elemental level, a non-deferrable deadline gives us a more acute sense of the death of time.
A cornerstone of Einstein’s theory of relativity is the proven fact that the gravity of large objects slows the passage of clock time. Metaphorically speaking, I think death has a similar “gravitational” effect on psychological time—in other words, time as we perceive it. As we near death (consciously), it exerts more and more pressure on our field of awareness—slowing down the present so that it doesn’t constantly evaporate into the past before we even knew it was there.[3]
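For those keeping score at home, the standard formula for gravitational time dilation says that a clock sitting a distance r from a mass M runs slow relative to a faraway clock:

t_0 = t_f \sqrt{1 - \frac{2GM}{rc^2}}

Here t_0 is the time that elapses on the clock near the mass, t_f is the time that elapses far away, G is the gravitational constant, and c is the speed of light. The deeper a clock sits in a gravitational field, the more slowly it ticks.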
Our own death is like our own black hole in the universe—our singularity—that infinitely dense but infinitesimally small moment when our psychological time stops.
The equivalence principle states that gravity’s effect on time can be reproduced precisely by acceleration. In other words, as far as the universe is concerned, the space-time effects experienced while standing on the surface of a large object like the earth can be recreated aboard an upward-accelerating rocket that exerts the same degree of pressure on your feet. Time behaves the same in the presence of both.
If death and black holes are the respective end points for psychological and clock time, then deadlines are the large, but less extreme, gravitational fields of awareness that we pass by on our journey to death. Put another way, I believe that much like man-made accelerating objects can reproduce the clock time effects of gravity, man-made deadlines reproduce the psychological time effects of death.
In short, deadlines have a way of focusing us on the present. Everything scheduled after a deadline becomes exponentially less important as we approach it. Productivity soars in the final moments before a deadline and often exceeds the cumulative output of the weeks before it.
Like the mind of a drowning victim, which simplifies and begins to focus on the present—albeit in a very primitive way—death and deadlines have a way of making us forget about the future. If death is right in front of us, there’s nothing on the other side to worry about. All the fears of the future, created by our wandering minds, stop. The future becomes so sublimely irrelevant that our entire existence becomes rooted in the present.
In this way, consciousness of death is the quintessential life hack. Anne Lamott gives great advice in her book Bird by Bird:
To live as if we are dying gives us a chance to experience some real presence. Time is so full for people who are dying in a conscious way, full in the way that life is for children. They spend big round hours. So instead of staring miserably at the computer screen trying to will my way into having a breakthrough, I say to myself, “Okay, hmmm, let’s see. Dying tomorrow. What should I do today?” Then I can decide to read Wallace Stevens for the rest of the morning or go to the beach or just really participate in ordinary life. Any of these will begin the process of filling me back up with observations, flavors, ideas, visions, memories. I might want to write on my last day on earth, but I’d also be aware of other options that would feel at least as pressing. I would want to keep whatever I did simple, I think. And I would want to be present.
Much like the future doesn’t exist in the mind of a very young child (try explaining the difference between six months from now and six seconds from now to a toddler), consciousness of death causes the future to matter much less for adults, who lose their sense of presence when death seems so far away.
Steve Jobs famously mentioned death as his own personal motivator when he addressed Stanford’s 2005 graduating class:
Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life. Because almost everything—all external expectations, all pride, all fear of embarrassment or failure—these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.
Jobs understood that only when we accept death’s certain victory, no matter how far off, do we feel free to finally tell the wardens of our own imaginary Camp Sumters to fuck off. And when we do that, life becomes the succulent meal awarded a dead man walking.
More than ever, I’m certain that death is the very best thing we have in life. It should be a deep, conscious spring from which all fountains of motivation are fed. It should not be allowed to sink into the depths of our subconscious to become an inexhaustible fuel source for our egos.
As the members of our species extend their expiration dates farther and farther into the future, it will become both harder and more vital to remain conscious of death. But stay conscious of it we must; because to live is not to stand back from death, but to step toward it—to toe that dead line.
Only the sound of the keys changed
Photo by April Killingsworth via Flickr
I don't remember ever seeing my high school typewriting class teacher smile. She seemed more like an animate autochrome of an early twentieth century secretary than a real person living in 1994. She was a stoic, yet feminine, drill-sergeant-of-a-woman whose life, while tangent to mine, had only one purpose, and that was to teach my classmates and me how to type just as fast and accurately as we could without looking at the keys of our typewriters.
It's unlikely that the school district of my rural, Deep South hometown had any intention of preparing us for an adult life dominated by computer use. It's far more likely that our fingers were being groomed, like decades of digits before them, to speak fluent QWERTY in case we went on to land post-Industrial Age white collar jobs that weren't white enough to have someone else do the typing for us.
Though it was an artifact left over from the mid-twentieth century, that typewriting class and the tortuously menial exercises I endured in it were arguably the most practical academic experiences I had in high school. I learned how to type in that hell. Really goddamned fast.
Then it was 1997
I never touched a typewriter again after high school. By the time I graduated, typing was something that I did exclusively on a PC. It was never a question of Mac versus PC but more a question of WordPerfect versus Word and Quattro Pro versus Excel.
As far as I knew, Windows was as much a natural part of the human experience as death and taxes. The Mac—virtually unknown to me at the time—was facing the former, and Windows was certainly the only way I knew how to do the latter.
Then it was 2008
Much like the religion you're born into, you don't really question the computing platform that greets you first. It's really not until the operating system—and the software on top of it—start failing you on a spiritual level that you become less able to swallow "that's just how things are."
That's where I found myself in the spring of 2008—sort of questioning, if not outright losing, my religion. I was sick of Windows, and I needed a new faith.
I had also been eyeing this guy named Merlin Mann who was doing things that seemed to me, at the time, so indulgent that they bordered on productivity perversion—at least in the context of the pale blue, point-and-click world I'd always known. Merlin was exalting a way of life that would surely draw stares anywhere outside of San Francisco, I thought. Hell, I was staring. I couldn't stop.
Merlin was talking to his computer, through his keyboard, but not at a command line. He was getting to files really quickly, and doing things with those files by typing verbs. He was puppeteering his computer in ways I'd never even pondered—but in ways that made "computing" seem so much more organic than anything I'd ever imagined.
Merlin was in love with Quicksilver, and anyone reading 43folders knew it. More broadly, Quicksilver was quickly supplanting LaunchBar, a Mac application launcher with roots in NeXTSTEP.
Quicksilver, as much as Mac hardware and OS X, made the Mac irresistible to me.
Though application launchers—a label that's as accurate as "phone" for iPhone—are not a part of OS X, they feel connected to it in time and virtual space because the two arrived in my life at the same cosmic moment.
The inherent speed and stability of OS X over Windows—and Quicksilver, the way it flattened my digital world—changed the way I worked with computers on a very fundamental basis. On a more meta level, I'd found a community that I could identify with. The passion of the people using these tools made everything feel more like an experience rather than a means to an end.
I don't think it's a stretch to say that everything I've done in work and pastime since 2008 was affected by those events.
Then it was 2009
My time with Quicksilver would be short-lived. By the time I found Quicksilver, it had already been put out to open source pasture by its then-mysterious maker, Alcor (whom we now know was Nicholas Jitkoff).
LaunchBar was making a comeback, and though its feature set wasn't as expansive as Quicksilver's, LaunchBar seemed faster and more responsive in Snow Leopard.
I switched to LaunchBar in September 2009 and happily used it for over three years.
Then it was 2013
When I upgraded from Lion to Mountain Lion in the summer of 2012, LaunchBar didn't feel the same. It was noticeably slower. I had to restart it often to restore the snappiness I'd always enjoyed prior to Mountain Lion. A later LaunchBar update resolved some of my issues, but it was never quite the same.
After ignoring Alfred for most of its life, I decided to give him a try. I figured anyone with enough confidence to wear a purple ribboned bowler hat might know a secret or two about life that I didn't.
I was right. With Alfred, I fell in love all over again with the speed of a really snappy application launcher. There are many things I like about Alfred:
And now it's 2014
I can't say for sure that I'll always use Alfred, but I can say that given how I was nurtured and natured with a keyboard, I will always prefer a "production" computing platform that has app-launcher-like abilities.
Even though each year brings new interfaces—from touch screens to speech-based UIs to things you wear on your face—interacting with a computer with keystrokes remains the fastest and most natural way of talking to a computer I've found.
Online education: We're learning what doesn't work
As the William Lowe Bryan quote goes, "education is one of the few things a person is willing to pay for and not get." What, then, does that say about free online college classes like those offered by sites like Udacity?
...all of these efforts have been hampered by the same basic problem: Very few people seem to finish courses when they’re not sitting in a lecture hall. Udacity employs state-of-the-art technology and sophisticated pedagogical strategies to keep their users engaged, peppering students with quizzes and gamifying their education with progress meters and badges. But a recent study found that only 7% of students in this type of class actually make it to the end.
I believe that the web is a legitimate place to teach, but I don't think educational content should be commoditized like tech news or cat videos. I think the more an online education platform relies on volume for profits—or the more a business model uses "education" as an eyeball-getter for some other purpose—the poorer the educational product will be. That's because, like other free(ish) internet things, the product and the customer are often the opposite of what they appear to be.
Just because the web has driven the cost of information to zero, don’t assume it will or should do the same for education. The two are very different.
The web is a candy land of information. Facebook statuses, Google searches, even Wikipedia entries all exist because people have a sweet tooth for information. Instant-information sites are great, I guess, but I don't think the business models that built such sites lend themselves to educational business models.
Education is much more than the distribution of information. An educated person is much more than an informed person.
Education should cost (someone) something. People who consume education should pay for education either as a product or as a service. And people who educate should be compensated based on their ability to educate, not their ability to create web traffic.
I also think calling this first generation of free education sites "online universities" is misguided. I don't think the university experience, in the pre-21st-century context, can be reproduced online. University, the product, is something people purchase for reasons that go way beyond lecture hall learning. The social experiences that happen in the conventional college setting can't be emulated online. And they shouldn't be.
I think web-based education will continue to evolve, and most likely become less core and more niche. The very best online education platforms will charge for their products and offer a very clear value proposition to their customers, who will buy (and complete) the education as a stepping stone toward practical economic goals.
Maybe instead of using the web to "innovate" education by gamifying and enabling the already-short attention spans spawned by the instant-information-gratification era, online educators will innovate the web by figuring out ways of re-introducing critical thinking into learning.
I'm not saying this will be an easy task, but I think it's one worth taking on. Otherwise, by molding "education" to the Facebook status quo, we surrender to one of the greatest ironies of the web: it's so open, yet most of us tend to curl up in one corner of it and nurse the same bottles of highly liquid, nutritionless information we did the day before.
The MacSparky Email Field Guide
By now, you've likely heard that David Sparks has released yet another Field Guide. In this one, he had the onions to take on email. I have to admit, back when I heard David was working on a book about email, I was a bit worried.
Email seems to be a subject that evokes either vomit or apathy, depending on the person. How can someone write an interesting book about email?
Well, the only way I know to answer that question is this: get David's book. He really hit it out of the park. I thought I knew everything there was to know about email, but I quickly found out that David knew more. Buying Email should be a pretty easy decision. Who wouldn't want to get better at something they spend hours a day doing?
David's approach for this book in some ways reminds me of the Nest thermostat. David took a mundane, but infinitely pervasive problem—email, that is—and assembled a set of solutions that are as practical as they are elegantly presented.
I challenge you to find a more aesthetically pleasing piece of technical writing. Mike Rohde's original illustrations are fantastic.
Don't bring a knife to a punch down fight
"You're not gonna do this with a pocket knife."
So says Gabe Weatherhead, the grittiest-DIY-hero-of-geeks you may ever have the pleasure of hearing online, before taking us on a fascinating journey through the conduit of his home network.
As someone who finds himself sewing more and more of his devices together with twisted pairs of copper, I really enjoyed the first episode of Gabe and Erik's new Technical Difficulties podcast. And this is just the start.
So put down your pocket knives, kids. Put in your earbuds, and get out your credit cards.
A net as big as the sea
With billions in funding, the N.S.A. is able to spy with nearly unthinkable scope:
The N.S.A. hacked into target computers to snare messages before they were encrypted. In some cases, companies say they were coerced by the government into handing over their master encryption keys or building in a back door. And the agency used its influence as the world’s most experienced code maker to covertly introduce weaknesses into the encryption standards followed by hardware and software developers around the world.
What disturbs me most in this article, however, is the tacit message that current and future U.S. national security depends on the federal government's ability to know every communication and transaction that takes place online.
Maybe modern warfare hasn't changed all that much. This is the 21st century's take on carpet bombing.
Leaning back
More time thinking. Less time overtly doing. That's really the gist of this highly quotable Economist piece on "leaning back" from work.
Creative people’s most important resource is their time—particularly big chunks of uninterrupted time—and their biggest enemies are those who try to nibble away at it with e-mails or meetings. Indeed, creative people may be at their most productive when, to the manager’s untutored eye, they appear to be doing nothing.
And this is why people are so afraid to step off their hamster wheels in that highly inefficient physical location most people call "work."
Retro tip: Cmd-P-P
If only I could quantify all the positive ripples of this classic MacSparky "Save as PDF" keyboard shortcut over the last five years. Yes, it's been over five years since he dropped this gem on the internet. I think I've used it every day since.
Crazy talk about email
I had the great fun and honor of joining buddies Gabe Weatherhead and Erik Hess on the Generational podcast. We talked about email, but instead of swimming with the usual email narrative currents, we talked about how we actually like email.
So far I haven't received any death threats, and the universe seems to be intact. So far.
Dispatch: the mail app I didn't delete
I really like Dispatch, and this is coming from a guy who tries all the new email apps but never writes about any of them. I feel compelled to say something about Dispatch, though. I've been using it daily since Federico Viticci reviewed it.
Most new email apps that I buy get deleted in less than a week, but Dispatch is sticking. Here are a few reasons why. . .
Native processing
I can send links straight to Instapaper for reading later, fire emails directly into the iOS OmniFocus app, and even send messages straight to Evernote. I love being able to act on mail locally (as I would in OS X) without forwarding mail to some hacky email account first.
Gestures that work
People gave a lot of praise to Mailbox, but I never could master the swipe-to-archive-but-not-too-fast-or-else-you'll-delete technique. Not only is Dispatch's swipe-to-archive gesture more intuitive, but there's also an undo option for any action you take.
Making mistakes while using any app is inevitable. The undo button that appears atop the screen after every action is one of Dispatch's best features.
Stars
I can star email with Dispatch. It sounds unremarkable, but starring mail is a hugely fundamental aspect of my email workflow, and it amazes me how many other apps don't support Gmail stars.
Message first, subject second
This is a nuance that only an email snob like me would comment on. But I really appreciate the way that Dispatch conceals the To and Subject lines when composing a new message. It has a similar feel to Drafts, where you naturally prioritize writing content before titling and sending content. This is just how I think things should work, and I appreciate this subtle design feature a whole bunch.
Snippets
The snippets feature in Dispatch is really useful. Being able to preload common chunks of text is a major productivity boost on any device that doesn't have a full-sized keyboard. But like Federico mentioned in his review, I just don't get why Dispatch didn't simply hook into TextExpander. This is really my only complaint about the app so far.
More broadly, I don't get why someone hasn't made a good email app that supports TextExpander. I really don't. It seems like such an obvious void to fill.
I'm excited
I'm excited to see where Dispatch goes from here. It's a beautiful app made by people who clearly think about email the way I do. I hope it only gets better.