As a long-time user of iA Writer (I’m literally writing in it right now) and fan of iA, I was delighted to see the announcement that a tangible paper notebook was coming: Notebook for Writers. Of course it’s beautiful. I appreciate the subtle guidelines on each page, and the description is exactly what I would expect from this team:
We wanted guidelines in the notebook, serving as a temporary scaffold to support your writing without causing distraction. The design had to reflect the core spirit of iA Writer: simple, clean, uncluttered. The use of ink had to be exclusively reserved for the owner’s handwritten text. After exploring different approaches, the decision to adopt delicate watermark guidelines became clear, even though its implementation demands high technical expertise and attention to detail.
To be honest I have tried to become a notebook person many times and failed. Moleskines in multiple sizes and shapes, conference and meetup giveaways, and even a black notebook paired with a white pencil didn’t work. Perhaps I’ll try again.
Opal Camera recently launched Tadpole, their second product. It’s adorable. When their first product, the C1, appeared I immediately signed up for the waitlist. It was exciting to see a startup working on a webcam that both looked cool and had a “DSLR quality” sensor. I waited. And waited.
When the C1 finally arrived I had already constructed a complex camera setup using my Fujifilm X-T2, and had even found a way to simplify it with fewer parts. The image quality was flawless. I had hoped the C1 would come somewhat close to the Fujifilm, but it was only marginally better than the LG 5K’s built-in webcam. Opal’s software for adjusting the camera’s settings was powerful but slow. The camera would get very hot, and its mount was not reliable. It actually fell off my display a few times and became dented.
I eventually gave up and sold the C1 once Continuity Camera launched. I highly recommend using your iPhone as a webcam instead of any third-party camera, including DSLRs. The biggest challenge was securing it to the LG 5K display or MacBook Pro, but that was solved when Belkin released the iPhone Mount with MagSafe for Mac notebooks and for Mac desktops and displays. Yes, those are exceedingly long product names.
Setup is easy because there is no setup. When a Zoom, Google Meet, FaceTime, or other call starts, the iPhone camera activates. Place the phone on the Belkin mount and you’re done. I recommend connecting your phone to power to ensure its battery doesn’t deplete by the end of the workday.
Considering Opal couldn’t make a good webcam with a large sensor, I recommend avoiding the smaller (but still pretty) Tadpole.
A look back at key products to help me determine if I should order a Vision Pro on launch day or wait for v2.
I’ve been thinking a lot about whether or not I will order a Vision Pro when it becomes available or if I will wait for v2. Based on early reports it sounds magical. I have not heard a single person who has used the device say otherwise. However, it is incredibly expensive. At $3,499 it would be the most expensive computer I’ve ever purchased (including a maxed out PowerBook I bought at the end of my Apple internship in 2002 with a 25% discount).
While chatting with a friend about the pros and cons of waiting, I began reciting the differences between a few first and second generation Apple products. “Wait a minute,” I said. “This would be a great blog post! Let me check some details.” I thought it would be fun and helpful to take a close look at several products and get a sense for how often it’s actually worth waiting. We can see what features were the highest priority to immediately add. I decided to focus on a few key v1 products: the Macintosh, PowerBook, Power Mac, iMac, iPod, MacBook Pro, iPhone, iPad, Apple Watch, and AirPods.
I’ll be honest. Much of this analysis is swayed by my current age, disposable income, nostalgia, and where I was in life when these products were announced.
TLDR: I’m going to buy a Vision Pro even though I should wait for a v2.
Macintosh 128k vs. Macintosh 512k
Since I was born in 1983 I didn’t get to experience the rise of GUIs and personal computers with the Macintosh. In the late 1980s I remember playing with gigantic, noisy, ugly Compaq and Toshiba boxes that ran DOS at friends’ houses, while at school there were slick, elegant, easy-to-use Macs.
Here are the differences between the original Mac 128k and its quickly arriving successor, the Mac 512k:
Spec | Mac 128k | Mac 512k
--- | --- | ---
Launch Date | January 1984 | September 1984
RAM | 128 KB | 512 KB
Latest Mac OS | 1.0 | 3.2
Cost | $2,495 | $2,795
Just 8 months after the Macintosh 128k’s debut the 512k model was released. The big change was an increase in RAM, which improved performance and allowed the Mac to run more software, including future versions of Mac OS. However, the Mac 128k was not a new product line in an existing category like an M3 MacBook Air. It was a paradigm shift; a new way of using computers. The excitement after the 1984 Super Bowl commercial must have been immeasurable. Perhaps this is a result of hindsight and my selfish desire to build out my museum, but I believe I would have purchased a Mac 128k on launch day and not waited for v2.
Macintosh 128k vote: buy
PowerBook 140 vs. PowerBook 145
I first encountered a PowerBook in either 1993 or 1994 when I began playing the alto saxophone in elementary school. The music teacher connected a PowerBook to a MIDI keyboard to play accompanying music. For a kid who was excited by computers it was extremely cool to see a computer working with other hardware.
Note: I chose the 140 and 145 because there was no successor to the 100.
If you waited just 10 months you got a faster processor, more RAM, and a lower cost with the PowerBook 145. Similar to my views on the Mac 128k, this was another revolution in computing. Apple had released the Macintosh Portable in 1989, but it was wildly expensive at $7,300 ($17,200 adjusted for inflation), a price that seems almost unreal today. The PowerBook was smaller, lighter, and more affordable. You could now take your Mac off your desk and work on the go! I don’t think waiting for v2 was worth it.
PowerBook vote: buy
Power Macintosh 8100 vs. Power Macintosh 8500
I first encountered a Power Mac in a lab at summer school where I learned about building websites, graphic design, animation, and 3D modeling using applications like Strata Studio Pro, Bryce 3D, Photoshop, and Director. On Fridays we got to play Marathon and Warcraft II all day. It was a good summer.
When the Power Mac line was announced we already had the very capable Quadra line. The 601/604 PowerPC processors were exciting (my dad’s Performa 6115 had a 601 and my Performa 6300 had a 603e), but I think it was safe to wait a year for a v2 while continuing to rely on the weathered but experienced 68040. Waiting got you double the storage, double the RAM, a graphics card, better expansion, and the much more powerful 604 processor.
Power Mac vote: wait
iMac vs. iMac (Slot Loading)
In the late 90s I had an aging Performa 6300 with an optional AV card that served me quite well. My priorities were playing Marathon and writing essays. The platinum design language was ready to be replaced.
At that time I worked at a local third party Apple retail store named Computerware and sold a lot of iMacs. People loved the colors and simplicity. Everything you needed was in the box. Jeff Goldblum elegantly explains this in a commercial, “Presenting 3 easy steps to the internet. Step 1: plug in. Step 2: get connected. Step 3: there’s no step 3.”
Note: I chose to skip the iMacs released 4.5 months after the initial launch due to the extremely short timespan and hardware similarities. They feel more like a bonus round than a v2.
The original iMac in its beautiful bondi blue case was a statement, and it belongs in museums. Apple was back and computers were fun again. Specs? Who cares! Well, I did. Floppy drives were quickly becoming useless thanks to the internet, but I still relied on a SCSI port for several more years thanks to the Iomega Zip drive and a CD burner. I was not ready to jump on the USB bandwagon. The updated specs that arrived in the slot-loading model the following year were worth waiting for, in addition to seeing how the industry reacted to USB.
Waiting also got you a faster processor, more storage, and more RAM. The bondi blue color option was no longer available, but graphite was.
iMac vote: wait
iPod vs. iPod (Second Generation)
When the iPod was announced I was a freshman in college. Mp3s were playing in every dorm room, at every party, and I assume in every dining hall. Legally? Doubtful. At some point most of us stopped ripping CDs and turned to Napster, Scour Exchange, Kazaa, LimeWire, etc. At Computerware I sold a few Creative Nomad Jukebox mp3 players that were intriguing but unappealing. There was clearly another paradigm shift occurring, but a solution to manage, transfer, and traverse a large collection of mp3s had not presented itself… yet.
By the time I arrived at college I had embraced and experimented with a variety of ways to collect, organize, and listen to music. In 2001 I had already moved on from a Rio 500 and heavily invested in a Sony MiniDisc player. I gleefully walked around campus with a little notebook of discs and swapped them multiple times each day. The MiniDisc ecosystem combined with a very easy way to quickly download mp3s in my dorm room made me hesitant to immediately purchase an iPod. I felt it was safe to wait until a v2 (which I definitely purchased). Once the iPod was paired with the iTunes Store there was no going back.
iPod vote: wait
MacBook Pro vs. MacBook Pro (Refresh)
The MacBook Pro replaced the beloved PowerBook G4 Titanium. Yes, the titanium paint scratched easily. Yes, there were issues with the little magnetic hook that kept the lid shut. Yes, the fans roared when I played Medal of Honor. But it felt so good to say the word “titanium” when describing a laptop. For my freshman year in college I convinced my parents to ship my Blue and White Power Mac G3 from California to New York. It was gigantic in a tiny dorm room. After my summer internship at Apple the following year I purchased a PowerBook G4 Titanium which was more suitable for a student lifestyle. A few years later Apple surprised the industry by announcing a transition from PowerPC processors to Intel processors, and the first Apple laptop with an Intel processor was the MacBook Pro.
It’s worth reflecting on the MacBook Pro name considering it started in 2006 and we still have it in 2023. The PowerBook name lasted 15 years (1991-2006). The MacBook Pro name has lasted even longer! We’re at 17 years and there’s no sign of changing it. I would argue there was an opportunity to create a new name to coincide with the Apple Silicon transition, but that didn’t happen.
When the MacBook Pro was announced we were all using titanium or aluminum PowerBook G4s, and they were fantastic machines. Coworkers at the startup I was working for at the time had 12, 15, and 17-inch PowerBooks. Sticking with the PowerBook avoided any software hiccups during the transition from PowerPC to Intel architecture. Waiting just 10 months also got you the Intel Core 2 Duo processor which significantly extended the life of the computer. It was clearly safe to wait for v2.
MacBook Pro vote: wait
iPhone vs. iPhone 3G
When the iPhone was announced I was using a boring Motorola v551 flip phone. I had run through a series of Ericsson and Sony Ericsson phones that synchronized contacts and calendars over Bluetooth with iSync. Eventually those capabilities on candybar phones fell out of fashion with the rise of pre-iPhone smartphones. The leaders at the startup I was working for had a variety of these devices. Blackberries, Treos, BlackJacks, etc. One would think as a tech enthusiast I would have purchased one, but I vividly remember disliking all of them. The Sony Ericsson T68i and T610 were tiny and powerful. The smartphones mentioned above were gigantic and barely more capable. It didn’t seem like a valuable tradeoff. Then January 2007 came and Jobs delivered arguably his best keynote.
Here are the differences between the iPhone and its successor, the iPhone 3G:
Spec | iPhone | iPhone 3G
--- | --- | ---
Network | EDGE | 3G
Storage | 4 GB | 8 GB
GPS |  | ✓
Latest iOS | 3.1.3 | 4.2.1
Launch Date | June 2007 | July 2008
The iPhone was obviously a day 1 purchase. No question; no hesitation. I did not have experience with 3G connectivity so I didn’t feel like I was missing out. EDGE was painfully slow but it didn’t matter. The experience of using an iPhone was worth living with the slow (and unreliable) AT&T network. Also Google Maps worked surprisingly well without GPS.
I saw an iPhone up close 3 months before launch at a car meetup in San Jose. There was an Apple VP showing off his Ferrari, but everyone there was more interested in the pre-launch iPhone he was demonstrating. Launch day came and I lined up outside the Palo Alto Apple Store for 2.5 hours before it opened. I got one. It was exhilarating.
iPhone vote: buy
iPad vs. iPad 2
For a few years pundits praised computer companies for producing cheap, crappy, miniature laptops referred to as netbooks. They had poor specs, tiny screens, and terrible keyboards. Heck, if you weren’t careful and selected the cheapest option, a netbook would show up running Linux. People assumed (and hoped) that Apple would join the fray with its own take on the netbook. Fortunately they never did. Instead, Apple waited until the netbook faded away and the tablet category arrived. Microsoft debuted its “slate” computer at CES in 2010, and, after years of speculation regarding a more portable portable computer, Jobs introduced the iPad.
Here are the differences between the iPad and its successor, the iPad 2:
Spec | iPad | iPad 2
--- | --- | ---
Processor | A4 | A5
RAM | 256 MB | 512 MB
Cameras |  | ✓
Latest iOS | 5.1.1 | 9.3.6
Launch Date | January 2010 | March 2011
Jobs sat in a Le Corbusier Grand Confort Lounge Chair on stage, reclined, and held up the iPad. I was sold. The first casual computer. I wanted both the chair and the iPad, and, somehow, the iPad was a shockingly affordable $499 compared to the rumored $999. With the accompanying iPad Keyboard Dock I believed I had unlocked a new productivity setup. However, the iPad was and continues to be a consumption product. I have never embraced it as a primary computer. I just love moving files around and looking at multiple windows simultaneously too much. The iPad wasn’t another paradigm shift, but it still had a magnetism and curiosity that made it a launch day purchase.
iPad vote: buy
Apple Watch Series 0 vs. Apple Watch Series 2
When the Apple Watch was announced I was proudly wearing a TO watch by Issey Miyake. I still love that watch despite the difficulty of telling time when the lengths of the hour and minute hands are reversed. I had dabbled in the smartwatch category by backing the Pebble on Kickstarter in 2012. I wore it for a bit but quickly found it clunky and unhelpful. I recall the notification tap feeling particularly cheap.
Years had passed since the iPad launched and the industry was curious as to where Apple would venture next. The announcement was fun but left many questions unanswered. What would an app experience be like on such a small screen? Do I even want to send my pulse to a friend? Most importantly, will the battery last all day?
To be candid I purchased an Apple Watch on day one, but upon reflection I do not believe it was vital compared to some other Apple products. Both the core functionality and third-party apps were slow, and the battery life was mediocre. In terms of software it took a few iterations of watchOS for Apple to learn what the watch excelled at and focus on it. Hardware quickly improved though. Series 2 pushed battery life to the point where I wasn’t worried it would deplete to 0% during a long workout.
Apple Watch vote: wait
AirPods vs. AirPods (Second Generation)
I love headphones. Love. When Shyp was around I would buy headphones, try them on, and immediately call Shyp to pick them up for a return. Before the AirPods and AirPods Pro launched I was a happy Bang & Olufsen Beoplay H3 owner. They were elegant and I enjoyed their sound profile. I dabbled with Bluetooth headphones starting in 2013 for running, but still used wires for calls and commuting.
The first generation AirPods were fantastic. Perhaps not a paradigm shift or leap in computing, but they certainly reset the Bluetooth headphone industry. Now almost every pair of Bluetooth headphones follows the same concept. Cases had been used as charging mechanisms in the past, but the sound and satisfaction of placing the AirPods into the case and shutting it were unmatched. It was absolutely not worth waiting almost three years(!) for a v2, although I did purchase the second generation because I’ll always jump at improvements to connectivity and battery life.
AirPods vote: buy
Tally
Let’s review the list and votes:
Product | Buy | Wait
--- | --- | ---
Macintosh | ✓ | 
PowerBook | ✓ | 
Power Mac |  | ✓
iMac |  | ✓
iPod |  | ✓
MacBook Pro |  | ✓
iPhone | ✓ | 
iPad | ✓ | 
Apple Watch |  | ✓
AirPods | ✓ | 
5 votes for “buy” and 5 for “wait.” An even split. I’m surprised by this! Before conducting this exercise I assumed there would be fewer “wait” votes. I’ve learned nothing except perhaps I’m more mature than I realize.
Vision Pro’s Value
When applying this breakdown to the purchasing decision for the Vision Pro, it’s worth noting which items in the list were exorbitantly expensive and which were affordable. For example, the Macintosh and PowerBook were way more expensive than the iPod and iPhone. At $3,499 the Vision Pro is not a product one buys on a whim.
But what about value? I use my iPhone and AirPods Pro constantly so the value to cost ratio is reasonable. Will I wear the Vision Pro for 8 hours per day during the work week? I can’t use it in meetings if someone else is in the conference room. That would be awkward. If it makes me more efficient at work and video calls become more enjoyable, perhaps it’s worth investing in just for work. I certainly won’t wear it at night if I want to stay married. I also won’t wear it during weekends since I have two kids under three. No time for dad to play with his toys.
Assumptions About a Vision Pro v2
Based on all the products discussed above and the components in the Vision Pro I think we can make some assumptions regarding inevitable improvements in a v2:
It’s weird to toss around a word like “magic” when referring to tech products. The word should be reserved for moments when the future becomes obvious. For example, the first time I used a web browser the world felt infinite, and it was obvious that I would use a web browser every day. The first time I streamed video in a web browser content felt infinite, and it was obvious I would watch videos every day. I wonder if, the first time I see a blinking cursor and start typing on a virtual keyboard in a Vision Pro, computing will feel infinite.
Image credit: Apple
Raymond Wong describes his experience with Spatial Video as “emotional” in a recent post on Inverse:
…These convos are very precious to me, so to see them replayed with a sense of presence really tugged at my heartstrings. At one point, I fought back a few tiny tears if only because there were three Apple reps sitting next to me… At a certain distance and window size, spatial videos can look life-sized. But even when I “pushed” the video window farther away (enabled by looking at the bar at the bottom of the window and then pushing it farther from me), seeing my mom in 3D made me emotional.
Outside of productivity and entertainment, it sounds like the Vision Pro can create “emotional” experiences unlike any other computing platform. That’s where I see the magic potential. Lately my daughter insists on calling grandmothers, aunts, uncles, and cousins on FaceTime during breakfast every day. I’m delighted that she can see family members around the country whenever she wants. A few years ago this required a laptop, webcam, and iChat AV. Before that it was even trickier. Now it’s a single tap on a device we all carry. Perhaps Spatial Video will be the next leap in human connection.
Decision
It’s hard to believe after writing all of this I’m still wavering. Acquaintances who have used a pre-launch Vision Pro claim it’s incredible. They insist waiting for a v2 is unnecessary and, knowing me, unlikely. Let’s be honest here: I’m going to buy a v1. I’m just not sure if it’s going to be day one or after I get to play with one.
Thank you, Remy, for proofreading and shouting “What? You’re buying a Vision Pro? I don’t think so.”
I love gadgets, and becoming a parent greatly expanded the number of products I get to research, configure, and maintain. This also means I occasionally have an opportunity to go deep on a new technical problem. The latest example is with the Nanit Pro Camera. It’s a small WiFi camera you mount to the wall or place on a stand, and it allows you to see live video of your baby using a mobile app. This is different from a EuFy Baby Monitor (yes two cameras pointing at the same baby) which broadcasts to a dedicated monitor using RF instead of WiFi. Using a EuFy is helpful because you don’t need to launch an app to see what’s happening in the baby’s room; you just glance at the portable monitor. It’s important to understand the difference: the Nanit broadcasts the video feed to a server while the EuFy broadcasts to a local device.
My wife, son, and I recently visited Philadelphia so he could meet his great grandmother (don’t say that “nasty” word around her) and only brought the Nanit. The hotel provided a crib and we used a SlumberPod (a must for all parents who travel with babies) to ensure Wolfe had a dark environment to sleep in. After constructing the SlumberPod, which conveniently has a small pocket to place a camera in, I began the process of connecting the Nanit to the hotel’s WiFi network. This is where the trouble began.
The hotel’s WiFi did not require a password. You simply connect to the open network and authenticate using your last name and room number in a webview. We’ve all seen this before at hotels, offices, airports, etc. The problem is that the Nanit’s network settings require both an SSID (network name) and a password. There is no way around this. Fortunately the hotel also offered a password-protected network, but when I tried it I consistently encountered “network error 400.” The only way to make this work was to find a way around the open network’s browser-based login.
Those margins are a bit overkill
I searched DuckDuckGo, asked ChatGPT for help, and browsed Reddit. Nothing. Other users complain about the password requirement, but Nanit has not offered a solution in the app. Finally I checked Nanit’s documentation and learned that their cameras don’t work with “captive portal” networks which use a webview to authenticate. The way to make it work is by using a personal hotspot or a travel router. Now I was intrigued!
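As an aside, if you want to confirm that a network really is a captive portal before fighting with a camera, one quick check from a laptop is to request Apple’s connectivity test page with curl (a generic trick, not something from Nanit’s documentation):

```bash
# On an open connection this returns a tiny page containing "Success".
# Behind a captive portal you get the portal's login page (or a redirect) instead.
curl -s http://captive.apple.com/hotspot-detect.html
```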
Personal Hotspot
Let’s start with the personal hotspot because this took some thinking and tinkering. Here is the process I came up with:
Activate the personal hotspot on my iPhone
Connect to the Nanit with my wife’s iPhone
Configure the Nanit’s network settings to talk to my iPhone’s hotspot WiFi network
Now whenever I turned on the personal hotspot on my iPhone, the Nanit would automatically connect to the phone after a few seconds, and start broadcasting the video feed to the server. This technically worked but the Nanit would chew through my data plan if I left the camera connected to the hotspot network all night. I had to find a better solution.
Travel Router
I recalled listening to an episode of The Vergecast in which the hosts discussed the convenience of traveling with a portable router. If you use the same SSID and password as your home router, all of your devices will automatically connect to the portable router. Phones, tablets, laptops, and, most importantly, Nanit cameras. I was delighted to finally have a reason to buy another tech product.
I ordered a GL.iNet GL-SFT1200 on Amazon for $39.90 and luckily had an option for one-day shipping. I was impressed by both its low cost and small size. It’s perfect for throwing into a suitcase. If a future trip includes both my wife and me on video calls, I will probably purchase a version that can handle more bandwidth. However, for a single Nanit camera I assumed it would suffice.
GL.iNet GL-SFT1200
The next step was to configure the router to extend the hotel’s WiFi network into a new network with my home’s SSID and password. This would allow the camera to connect automatically as discussed above. Here are the steps I followed:
Connect the router to power
Connect your laptop to the router’s WiFi network which starts with GL-SFT1200...
Open a browser and go to 192.168.8.1 (yes, that should look funny to my fellow old Linksys WRT54G owners)
Set an admin password for the router
Now you’ve reached the admin portal. The goal is to make the GL.iNet extend (or repeat) an existing WiFi network. Click “Scan” in the Repeater area to set this up.
Use the dropdown menu to select the SSID of the WiFi network you want to repeat and enter its password.
Wait a few seconds and you’re done. Seriously. It’s that easy.
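If you want to double-check the repeater before pointing a camera at it, a quick sanity check from the Mac’s Terminal works too. This assumes the router kept its default 192.168.8.1 admin address and that en0 is your Wi-Fi interface:

```bash
# Confirm the laptop joined the GL.iNet network and got an address (192.168.8.x)
ipconfig getifaddr en0

# Confirm the router itself is reachable
ping -c 3 192.168.8.1

# Confirm the repeated hotel connection actually reaches the internet
ping -c 3 www.apple.com
```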
I checked the Nanit app and… boom. Live video.
This worked beautifully for extending the hotel’s WiFi network. If you have access to an existing router it’s even easier. The entire family recently took a trip to an Airbnb in Carmel which had a bunch of eero routers. I simply connected the GL.iNet (gosh that’s a bad name) to an open ethernet port on the eero and I was done. Both Nanit cameras automatically connected to the GL.iNet’s network. Zero configuration. Zero frustration. Two sleeping kids.
I highly recommend purchasing one of these for your next trip. Your family will appreciate all of their devices instantly connecting to the internet.
I recently read an article on Fast Company titled “These charming tools are a radical vision for how you’ll use your computer.” The article discusses a concept created by Approach Studio that asks the question: what if we updated physical interfaces for the digital age? How might they look different? I watched the video and gifs several times and reflected on why I appreciate certain physical products: tactility.
One can appreciate, grow attached to, or even love the way certain products feel when interacting with them. For example, pressing the start engine button in a car you enjoy driving, turning a knob on a coffee grinder, or even feeling haptic feedback after tapping a button in a beautiful mobile app. Think about the power button or switch on a product that is a part of your life. For example I vividly remember the switch on the back of my Mac LC and the button on my dad’s Performa 6115. Each had a particular feel and sound that contributed to a moment of anticipation before the old Mac OS startup chime. (Of course that sound sometimes meant dread if I was waiting to use Microsoft Word for writing an English class essay.)
Approach Studio goes further than presses and sounds by bringing concepts we have learned since the dawn of mobile computers out of the digital and into the physical. These demos are easy to quickly scroll through, but I want you to take a moment to reflect on how each can be an improvement to the home.
Image credit: Approach Studio
First, think of a typical switch that you flip every day. Perhaps a small and subtle light switch or an old thermostat’s mode switch for example. Now imagine a switch that provides colors to indicate its state, has a slight tension as you slide it, and ends with a reactive bounce to feel alive. This would be more enjoyable to use because of its playfulness, and it would add character to your home with its aesthetics. (Ideally one would be able to customize the color. I’m not sure my wife would approve that shade of green.) Something in your home that you observe and touch every day should both look appealing and feel good to control.
Image credit: Approach Studio
Next, think of some dials you turn. Fans, thermostats, stereos, etc. When the factory ships a product with dials they are a fixed size forever. What if instead a dial could adapt to specific tasks? This is reminiscent of Steve Jobs’ initial explanation of the iPhone’s large, multi-touch screen in 2007. Instead of fixed buttons which Blackberry, Motorola, Handspring, and Samsung phones had, the iPhone could adapt to different tasks. Approach Studio demonstrates this with a dial that grows and shrinks depending on how accurate the user needs to be in the moment. A small dial is ideal for a low number of options (adjusting a lamp’s brightness from 4 to 5), and a large dial is preferable when the user needs to be accurate (adjusting from 66 to 72 degrees on a thermostat). A dial that can grow and shrink depending on the user’s needs can allow one dial to control multiple products.
Image credit: Approach Studio
Lastly, if you are currently on a laptop or desktop computer, try hovering over a few links or buttons. Go ahead. The hover state is a subtle way a computer can alert the user that an element can be clicked or that more options are hiding behind it. (In my opinion software designers occasionally rely too much on hover states that require the user to move the mouse to an element before discovering additional options. Designers also occasionally place crucial functionality behind a hover state that is inaccessible on touchscreens. The usual workaround is to force touchscreen users into the hover state after a tap, which makes them less efficient. Tap and hold? That doesn’t work either because Safari and Chrome have built-in functionality for tap and hold. Basically, avoid relying on hover states.) However, when used properly, hover states can add delight and surprise to an interface.
Now imagine if physical objects had hover states similar to software. Not only would they feel alive and fun, but they would also have more accessible buttons. Your finger would travel a shorter distance allowing you to press more buttons in less time. This could start with Microsoft Excel experts who need to enter data quickly, and it could lead to other innovations in the home. Arming and disarming a security panel for example. Of course the trend is to slap screens on everything, but products that are designed for specific purposes could be improved with this innovation.
I would love to see some of these ideas incorporated into a future Elgato Stream Deck.
Last year I convinced the family to dress as Steve Jobs for Halloween, which included my 1-year-old daughter wearing a very cute pair of tiny New Balance shoes. While holding my daughter (who was chomping on an iPod nano from my Apple Collection) and posing for photos, I remembered and reenacted a very specific moment from the iPhone keynote in 2007. Jobs’ presentation remote stopped working, and he had to kill time on stage while people backstage fixed the problem.
He told a quick story that I believe was also discussed in Steve Wozniak’s book, iWoz, in which Wozniak built a device that disabled nearby TV antennae (it’s hard to believe all TVs used to have them). They pranked students in UC Berkeley dorms by tricking them into thinking awkward poses while holding the antenna would fix the TV’s reception. Jobs demonstrated one such pose for just a second before learning that his presentation remote was fixed and continuing to talk about the iPhone.
You can watch Steve tell this story and catch the pose starting at 1:15:16.
It’s beautiful. The team at Humane deserve tremendous credit for building something with such precision and a keen sense of aesthetics. The design of the Ai Pin and its peripherals like the charging case, attachment options, and battery booster is inspiring. When one reflects on the intersection of fashion and technology only a few companies and products come to mind: Sony with the Walkman, Apple with the iPod (of course), Beats Electronics with the Beats by Dr. Dre Studio headphones, and others. Perhaps Humane will join this list.
I am seriously considering purchasing an Ai Pin. However, I have some questions and concerns:
Using the Ai Pin requires telling everyone a new phone number. This means I will now have two phone numbers: one for the iPhone and one for the Ai Pin. I am obviously not giving up my iPhone anytime soon. This is reminiscent of the 1990s when people had car phones. To reach someone one would call a house line and then a car line. To be honest this was not super common, but I recall a few friends and relatives who had car phones.
How can I see and share photos? I have kids, nieces, and nephews now! Leaving my iPhone at home means no more sharing to iCloud Shared Albums or seeing updates in albums I’m a member of. I’m fine giving up Instagram or waiting until I’m at a computer, but shared albums are too important at the moment. Grandmas need to see photos of their grandkids! I suppose I could take photos with the Ai Pin and then instruct it to send them to the grandmas via text. What about cousins and aunts and uncles and friends?
Organizing the family with my wife is a full-time job. We constantly text each other to ask for help, make suggestions, send reminders, etc. That will become much more difficult without a keyboard. Perhaps we can rely on the Apple Watch to accomplish this?
The Ai Pin lasts only four hours before recharging, so it’s crucial to carry around an extra battery booster (or two?) that is charged. This means one must be vigilant with charging the battery boosters every day and also carry them around. This seems doable on a workday when I have a backpack, but what about the weekend? Currently at the end of the day I place my iPhone and Apple Watch on the MagSafe Duo Charger on my nightstand. I suppose I will need to make space for the Ai Pin charger which can charge both the Ai Pin and one battery booster. What about an extra battery booster? I’m genuinely concerned with this balancing act.
I could go on but the point is clear: the Ai Pin raises too many questions. It seems daunting to make such a drastic change to my daily life. One can argue how additive it is, but it’s certainly also subtractive. Do the AI features outweigh losing access to apps and a screen? How much will my productivity decrease? Is there a future I cannot see yet where AI supercharges productivity more so than innovation in today’s apps?
Take a step back to the late 1990s. We transitioned from encyclopedias and landlines to modems and pagers. Accessing information was limited and cumbersome. Then the cell phone arrived and it was purely additive.
For music we had cassette tapes, CDs, and eventually mp3s. When the iPod arrived we already had computers, a Firewire port, and a collection of mp3s. Most importantly we listened to digital music. Not only was the iPod additive, but it also enhanced an activity we already knew and loved. We were accustomed to carrying around devices that could play music too. The iPod merely replaced them.
The Ai Pin is a drastic change. I predict that the majority of buyers will continue to carry their mobile phones for the foreseeable future. Maybe by 2025 we will see the beginning of a transition to the Ai Pin as a sole device, but Apple and Google will continue to innovate and keep us hooked. Is hooked the right word? It seems too negative for how we perceive our phones.
It’s important to note that simply saying the Ai Pin will improve over time as its AI capabilities develop is not a strong argument. OpenAI’s ChatGPT, Google’s Bard, and others will also continue to innovate and become a deeper part of all the apps we use today. That could make our mobile phones even stickier. We have not even seen what Apple is working on regarding the latest and greatest in AI.
I have a decision to make. Meanwhile I’m so excited for the future.
I love using Blot.im for this website. It’s the perfect combination of customization and simplicity. I’ve tried so many platforms (Wordpress, Squarespace, Tumblr, iWeb, Carbonmade, Jekyll, etc.) and I’m delighted by Blot. However, its best feature is a bit interruptive when I have an urge to write a post. Blot uses Dropbox to host files which means writing starts with making a new file, naming it, and then copying and pasting a few snippets of data (title, tags, and date). I tried using the macOS stationery pad feature, but it didn’t quite work as expected.
I was emailing with Blot’s founder and he recommended I use Shortcuts.app to automate this process and reduce the number of clicks required to post. My goal is to click on an item in the macOS Dock and automatically create a text file, name it, save it in the Blot folder on Dropbox, populate it with the required snippets of data, reveal the file in Finder, open Blot’s images folder, and open the file in Sublime Text (or whatever editor I feel like using this month). One click and start writing. The last step will be manually dragging and dropping the file from the Drafts folder to the Posts folder to actually publish it. Perhaps I’ll think of a clever way to automate that step as well.
Let’s begin.
Prompt for a Title
First, open the Shortcuts app and make a new shortcut. Search for the “Ask for Input” action, and drag it into your shortcut. Set the “Ask for” area to “Text” and the area after “with” is what you will see in the prompt. I chose something simple: “Post name?”
Make sure you deselect “Allow Multiple Lines” because if it is enabled you cannot use the return key to proceed; instead a new line will appear in the text field.
Populate the Text File with Common Data
Every time I start a new post I manually type out the date, title, and the markdown needed to produce an image (I usually include an image in posts). This is tedious considering the computer knows the date, and I already entered a title.
Start the text with two lines: “Tags:” and “Date:”.
To automatically insert the date:
Right click next to “Date:”
Click “Insert Variable”
Click “Current Date”
Set “Date Format” to “Long”
Set “Time Format” to “None”
The next line will automatically populate the title that you entered in the prompt above. Start with “#” (an h1) followed by a variable you must set:
Right click next to “#”
Hover on “Insert Variable”
Click “Provided Input”
Then I have a few snippets of Markdown to help me get started including:
![]() for an image I want to share
[]() for a link (I always swap the brackets and parentheses accidentally)
Text to remind me to include a thought
Now you’ll be ready to start writing instead of first setting up a post.
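For a hypothetical post titled “Hello World,” the text assembled above drops something like this into the new file (the exact date depends on the “Long” format setting):

```
Tags:
Date: November 8, 2023

# Hello World

![]()
[]()
Text to remind me to include a thought
```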
Name the New Document
Let’s take the post name you entered in the first step and apply it to the new text file. This step is a bit finicky so be careful.
Search for the “Set Name” action
Drag the action into your shortcut
Right click on “Name”
Hover on “Insert Variable”
Click “Provided Input”
Immediately type “.md”
Click “Show More”
Deselect “Don’t include File Extension”
Step 6 is important. A blinking cursor appears next to “Provided Input” and you need to add the Markdown extension (.md) before clicking elsewhere. Getting a blinking cursor in this exact spot again is difficult.
Steps 7 and 8 are also important because the goal is to manually set the file extension to “.md” instead of the default “.txt,” which works in Blot, but ideally posts use Markdown.
Save the File in Dropbox
Note: I recently updated Dropbox, which now uses the new macOS File Provider API. I am not sure how this update impacts Shortcuts and its Dropbox actions.
For “Destination Path” enter the location of your Blot Drafts folder. Mine is /Apps/Blot/David Klein/Drafts. Make sure “Ask Where to Save” and “Replace Existing Files” are deselected.
Open Important Folders
As mentioned above, the last step will be dragging and dropping the draft file from the “Drafts” folder into the “Posts” folder when you’re ready to publish. If your post includes an image then you will also need quick access to Blot’s images folder. You can skip this step if you don’t need folders to automatically open.
Search for the “Run Shell Script” action
Drag the action into your shortcut
Enter open along with the paths for folders that are important to your writing workflow
For “Input” right click on “Saved File” and click “Clear.”
The scripts that I use are:
open "/Users/tehdik/Dropbox/Apps/Blot/David Klein/Drafts"
open "/Users/tehdik/Dropbox/Apps/Blot/David Klein/img"
Pause for a Moment
Let’s revisit our original goal here: writing! The text file that was created, populated with helpful text, and saved to Dropbox should automatically open so you can quickly begin writing. It took a lot of experimentation, but I finally discovered that you can’t open this file yet. For some reason the file system doesn’t see it! As a result you need to pause the script for 1 second. Yes, 1 whole second.
Search for the “Wait” action
Drag the action into your shortcut
Set the value to “1”
Note: The screenshot shows 3 seconds, but I continued to experiment after taking the screenshot and discovered that 1 second also works!
Open Your File
Now it’s time to automatically open your new file.
Search for the “Run Shell Script” action
Drag the action into your shortcut
Enter open along with the path for your Drafts folder
Add a /
Right click after the slash
Hover on “Insert Variable”
Click “Provided Input”
Add .md
For “Input” right click on “Saved File” and click “Clear.”
And there you go! Your new file will automatically open. If it opens in an unexpected text editor you can quickly fix that by setting the default application for Markdown files.
Right click on a Markdown file in the Finder
Click “Get Info”
Find the “Open with:” dropdown
Select the desired text editor
Click “Change All…”
Click “Continue” in the popup
Now all of your Markdown files will open in the application you selected.
Summary
Shortcuts is unpredictable. If you use Keynote or Things, you know what it feels like when clicks and keystrokes do exactly what you expect. Shortcuts is different and, as a result, frustrating to use. Several times I had to start over because I got into a state that I did not understand and could not escape. I also spent time trying to use actions to open folders automatically and failed. It seems funny to use shell scripts for a few steps, but that was the most straightforward way to open folders.
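In fact, once you accept a little shell, the entire workflow above can be roughed out as a single script. This is only a sketch rather than what I actually use; the paths match my Blot folders from earlier, and the date format only approximates the shortcut’s “Long” setting:

```bash
#!/bin/bash
# Rough shell equivalent of the shortcut described above (a sketch, not the real thing).

DRAFTS="/Users/tehdik/Dropbox/Apps/Blot/David Klein/Drafts"
IMAGES="/Users/tehdik/Dropbox/Apps/Blot/David Klein/img"

# Prompt for a title (the "Ask for Input" step)
read -r -p "Post name? " TITLE

# Create the draft in the Drafts folder and populate it with the date,
# the title as an h1, and the starter Markdown snippets
FILE="$DRAFTS/$TITLE.md"
cat > "$FILE" <<EOF
Tags:
Date: $(date "+%B %d, %Y")

# $TITLE

![]()
[]()
Text to remind me to include a thought
EOF

# Open the folders I need while writing
open "$DRAFTS"
open "$IMAGES"

# Give the file system a moment to see the new file (the "Wait" step)
sleep 1

# Open the draft in the default Markdown editor
open "$FILE"
```

Save it somewhere, make it executable with chmod +x, and you have a one-command version of the same idea.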
The research needed to figure out Shortcuts was mostly done using ChatGPT. It was helpful even when I was unclear about what to click on. I asked clarifying questions and ChatGPT answered (and apologized)! All of a sudden searching on Google, scrolling results, clicking on a result, interpreting the unique page design, and scanning text for an answer feels antiquated.
Good luck making a shortcut. Let me know how it goes on Twitter/𝕏, Mastodon, or Threads!
Nothing combines fun, functionality and just the right amount of nostalgia like the RayCue 128K. It looks like a miniature version of the iconic Apple Macintosh yet delivers modern features. A most versatile addition to your workspace, the RayCue 128K Pro boasts 14 docking ports, bluetooth speaker, and a cool display ready to show your favorite photos, time, date or any image you desire! Utilizing DisplayLink® technology, it also allows your computer to support 3 external displays. Plus, the RayCue 128K comes with a 7-port portable hub that looks just like a keyboard. It makes the perfect gift, office conversation piece or productivity splurge for anyone who remembers Apple’s early good old days!
Backed on Kickstarter! Can’t wait for this to arrive.
In 2011 I started collecting a few Apple products when a coworker who once worked at Apple gave me his QuickTake 150 and 200. I proudly displayed these on my desk and slowly added to the collection with an eMate, Newton, PowerCD Player, and a Cube.
My desk at iControl Networks. Shot on an iPhone 4 in 2011.
Over the past thirteen years the collection has grown through eBay purchases, Craigslist purchases, and friends/family who discover old products in a basement. Apple has made many products since 1976, and I certainly do not intend to own all of them. (Remember the Performa line? Yikes.) When considering adding something to the collection I run through the following questions:
Did I use one at home, school, or a friend’s house?
Is this product aesthetically pleasing?
Is this product important in Apple’s history?
I promise this makes sense in my head.
Recently I realized that I had never documented what I own and what I hope to acquire in the future. Well, my Apple Collection is ready to share. Let me know if I’m missing anything. Soon I will clean everything thoroughly and take a few photos of the museum in the garage (naturally, the only place I’m allowed to display everything).
Last month I received a DM on Twitter asking if I would like to be interviewed for Lovers Magazine. Naturally I was thrilled and delighted. After many revisions, a few photo shoots of my desk, and some taste input from Remy, I was finally ready to publish. Enjoy!
My relationship with music has evolved throughout my life as a result of new formats, devices, and services. My taste, however, has largely remained the same thanks to a few select rappers and groups. There’s a reason no one asks me to select the music during a dinner party.
I started with cassette tapes. I recall receiving an MC Hammer tape as a gift and not really understanding the point of listening to music. It was fun for a few minutes, but then I would get bored and go back to Legos. Eventually I discovered that one could record songs onto blank cassette tapes from the radio to create custom mixes although that introduced ads and babbling from DJs.
Then came CDs. I received a Green Day album as a gift and encountered the same problem: fun for just a few minutes. The first change that impacted my listening habits came in the form of a device: the Discman. Now I could listen to music without being tied to my little stereo. Music became a little more fun, but I’d still get bored with an artist after a few songs. I would constantly swap CDs to satisfy a mood.
Crucial for getting pumped up for cross country races
One day it all changed thanks to the MP3 format and Napster application. Now I could download any song I wanted at approximately 2-5 kilobits per second. Sometimes I would leave a few downloads running all night and hope that by the time I woke up for school a few had completed. The problem was storage. My Performa 6300’s 1.2 gigabyte hard drive was close to being full, so I stored my MP3 collection on a 100 megabyte Zip disk. This forced me to constantly delete and reprioritize my collection.
Napster running on Mac OS 9 in 2001
MP3s were awesome, but I was tied to my computer to enjoy my music. A device was needed. Around this time I got my first job as an intern at a startup in Palo Alto, CA named gig.com. Their goal was to build an “internet locker” for storing and streaming a music collection. Great idea, but way too early. Since it was a music company every employee was given a Rio 500 which had a whopping 64 megabytes of storage (approximately 14 songs). The fun was back, but one had to constantly manage the device and swap songs slowly using iTunes over a USB 1.0 connection.
The beginning of the future
My neighbor purchased a CD burner which represented another step change. 650 megabytes per CD! Now I could travel with a packet of CDs, each ready to play approximately 14 songs. Most importantly: these were my mixes. I made mixes for genres, moods, and occasions. Fun and variety were achieved, but now I had the inconvenience of carrying around a packet of CDs. At some point I got my own CD burner and CD burning software finally allowed one to burn CDs without first converting to WAV files. Yes, for some time you needed 650 megabytes of hard drive space just to burn the data to a CD.
Burn an entire CD in 18 minutes!
Time for an unfortunate, nerdy misstep. I purchased a MiniDisc player (specifically the Sony MZ-R55). It was so cool looking, and popping those little discs in and out of the player felt futuristic. Each disc held 74 minutes of music, but it took literally 74 minutes to transfer the data. One of the benefits of MiniDiscs was that the ID3 tag data could appear on the little remote you held (or clipped to your clothes) while playing music. The remote had a tiny screen that would display the artist and song name. This sounds insignificant but at the time it was helpful to browse songs by name before hitting play.
When I showed up to college with the MiniDisc player, listening to MP3s on the go was arguably still niche and difficult to navigate. People were amassing large collections on their computers and listening using applications like iTunes and Winamp. Something was missing.
So small, so cool, and so silly
The iPod. At $399 it was a tough sell so I continued recording and swapping Minidiscs for a year while I saved money. I recall initial reactions focused on existing products that had more storage for less money. But, as I’ve repeated to anyone who will listen since first using a Mac, they were ugly, unintuitive, and slow. Instead of up, down, left, right, the iPod allowed you to scroll quickly using a simple, circular motion. Scrolling had acceleration which somehow felt both magical and natural. One could navigate playlists and long lists of songs, and transfer them quickly from a computer. Slow downloads and slow CD burning all of a sudden felt archaic when one watched the speed of file transfers onto an iPod using a FireWire connection. Songs transferred in seconds! My first iPod was the 10 gigabyte second generation model purchased in 2002. The moving scroll wheel was replaced with a capacitive wheel that didn’t physically move. This was tricky in cold Ithaca winters when one constantly wore gloves outside.
Now we had fun, elegance, beauty, convenience, and speed. The last piece for Apple to fix was the source of MP3 files. We were stuck using LimeWire on the Mac which meant the occasional corrupt file, inconsistent ID3 tags, no album artwork, and breaking the law.
1000 songs in your pocket. But also a Firewire connection.
One more detour is necessary. In 2002 I joined Apple as a summer intern in the Hardware Engineering department. I worked in the Build to Order lab ensuring that new third party hardware worked as expected with current and soon to be shipping Apple software and hardware. This was a dream come true.
Interns were fortunate to meet with and hear directly from executives including Jon Rubinstein, Tony Fadell, and… Steve Jobs. I remember receiving an email saying that the next executive’s name would not be shared. This was it. We gathered in building 4 and in he walked. He talked about many subjects including his personal life when someone asked what his biggest mistake was.
Music came up. He discussed the experience of downloading and ripping music and how it wasn’t good enough. Less than a year later the iTunes Music Store launched. During the announcement I recalled sitting in that room as an intern listening to him talk about owning music in the digital age. He was telling us Apple’s plans almost a year in advance. Incredible.
Buy, transfer, unplug, listen
The iTunes Music Store launched on April 28, 2003. Somehow 20 years have passed. I remember updating to the new version of iTunes, searching for a song, and clicking “Buy.” The song downloaded in seconds. Finally. No more LimeWire. It was now easy to get MP3 files that had high quality album artwork, correct labels, and no random blip or scratch sounds.
I didn’t have a lot of money in college so my collection grew slowly at first. A few months after the launch, Apple announced that 100 million songs were going to be given away as codes in Pepsi bottles. I was more of a Diet Coke with Lime kind of guy at the time, but I knew it was time to make the switch. As a part of my college meal plan I could purchase bottles of Pepsi with lunch, dinner, and maybe a snack or two throughout the day. Not every bottle included a code though which meant a wasted opportunity.
I love when digital meets tangible
People quickly discovered that if you tilted the bottle at a precise angle, you could see if the cap had a sufficient number of characters to represent a code. Boom. My collection exploded. I started buying multiple bottles at a time. I also looked insane standing by refrigerators for several seconds at a time while tilting bottles and carefully looking at them.
For some reason I thought it was fun to save the bottles and organize them into a grid even as the collection grew to over 100 bottles. Eventually it was time to grow up… and just save the caps. I found them in a little box in 2017 when I moved in with my girlfriend (now wife and mother of the cutest kid in the world). She (rightfully) made me throw them away. Fortunately there is photographic evidence.
Taken in my senior year apartment in Ithaca, NY
Now… I no longer feel connected to music. The playlists I so carefully organized disappeared during Apple’s journey from iTunes to Apple Music. My purchased songs are gone too. Perhaps I need to pay $20 per year for iTunes Match? I no longer feel compelled to organize music in Apple Music or Spotify (the Kleins have a family account which is primarily used to play Raffi nowadays). Because music is infinite it feels cheap. Easily discarded. Boring. I assume my feelings towards music are also a result of never being in an environment where music is playing. I also don’t feel I have time to truly listen to music. If I want background noise or if I’m driving/commuting I prefer listening to podcasts. I find them both more entertaining and educational. There are also podcasts to match moods similar to how I used music in college.
Thank you for joining me on this tale of my life through music. Hearing that it was the iTunes Music Store’s 20th anniversary got me thinking about all of these devices, formats, and naturally my bottle collection.
All of my work stems from the simplest of ideas that go back to the earliest civilisations: making clothing from one piece of cloth. It is my touchstone. I believe that all forms of creativity are related.
Issey Miyake passed away this month. My preferred watch between 2013–2015 was a Miyake piece called the TO Watch, and I recently learned it was designed by Tokujin Yoshioka while perusing one of my favorite websites, Minimalissimo.
Although strikingly simple, it was difficult to ascertain the time because the hour and minute hands are reversed. Friends would chuckle when it would take me a few seconds to become confident I knew the correct time after looking down.
Perhaps it’s time to purchase a new model. It’s gorgeous.
When I envision the future of the smart home I see a gradual transition away from cold, hard, and clunky hardware towards a more playful, soft, and delightful aesthetic. A great example is a recent Google experiment called Little Signals:
Little Signals explores new patterns for technology in our daily lives. The six objects in this design study make use of different sensorial cues to subtly signal for attention. They keep us in the loop, but softly, moving from the background to the foreground as needed.
Take a slow walk around your home and notice how you have arranged each area. Your desk, shelf, table, bedside, entryway. You chose objects that feel personal and come together to create a feeling. Why can’t computers, big, small, and tiny, similarly join your other belongings?
Today’s tech products stand out. They make aggressive sounds. They interrupt. They should remain on your periphery and gently draw focus when necessary before quickly fading away.
Connectivity Concept by Deutsche Telekom Design & Customer Experience and Layer
As the tools of domestic technology become increasingly linked together, in a flow that integrates digital and analogue devices, the designs in the Connectivity Concept collection make that connection harmonious and fluid — digitally and aesthetically.
Imagine a family of products throughout your home that feel connected digitally and aesthetically. Google’s latest Nest Cam, Nest WiFi, and Nest Thermostat are pursuing an elevated elegance through a new hardware visual language, and subtle yet fun color options like sand, fog, sage, sky, and coral. As a result I find myself turning to Nest for my smart home needs. Unfortunately the software has not caught up with the hardware’s softness and delightful presence.
DIRIGERA by Ikea
The last example is from Ikea, a company that my family still relies on for basic home necessities. The product photographed above looks pleasant atop a stack of books. No antennas; no flashing lights.
It will take time for Ikea to earn my technological trust, but I am excited to see them participating in this field. Perhaps someday I will venture into the new San Francisco Ikea and demo their WiFi speakers and smart air purifiers.
The most surprising part about this industry? Apple isn’t even mentioned.
The iPod was a groundbreaking piece of consumer electronics. With new generations introduced every year after its launch in 2001, the iPod product family reflected a period of rapid development in processing, storage, displays, and user interfaces, anticipating the iPhone’s blockbuster release in 2007.
This month we explore the evolution of the iPod from the inside out with our Lumafield Neptune CT scanner, guided by none other than Tony Fadell, the inventor of the iPod and the founder of Nest.
The scans of the first generation iPod, the sixth generation iPod, and the first generation iPod nano are fun to see, but the truly exciting part is the video of Tony Fadell discussing a few details from each iPod. For example, iPods originally stored MP3s on tiny spinning hard drives, and simply dropping a hard drive on a table can easily damage its head. I highly recommend watching the video.
When I was an intern at Apple in 2002, Tony told the interns the story of how the iPod came to be. The first iPod had recently launched, and during that summer the second generation model shipped with 10 gigabytes of storage.
It’s hard to imagine how they designed, built, and launched the original iPod in just 8 months.
I have a long list of Apple products that I believe are unique, desirable, or just unusual waiting to be proudly displayed in my Apple museum. Recently I found myself endlessly browsing eBay (again) searching for something on the list and I came across an iMac in the original color: Bondi Blue. I have a fondness for blue and green Apple products like the iBook, Power Mac G3, and this iMac. I clicked Buy It Now not knowing that I would soon have a delightful email exchange with the owner.
The iMac belongs in the museum because it marks the beginning of Apple’s transition from a boring, confusing, beige 1990s into an artistic, colorful, opinionated 2000s. Steve Jobs had returned to Apple, eliminated most of the product line, elevated Jony Ive, and released the iMac.
I sold many iMacs when I worked at Computerware, a local Apple retail store before Apple had official stores. By the time I was employed there, Apple was selling models with new colors (including Blue Dalmatian and Flower Power) and FireWire ports for connecting digital video cameras.
The iMac was notable in the computer industry for many reasons:
Apple’s legacy ports were removed, breaking compatibility with old keyboards and mice. USB, along with the ability to plug and unplug peripherals without restarting, was the future.
There was no floppy drive. If you wanted to move a file to another computer, you were expected to use the internet.
Plug in power, plug in the keyboard and mouse, and plug in the modem. Simplicity was part of the Jeff Goldblum ad campaign showing off how easy it was to set up an iMac.
Pick it up with a handle. Apple brought back the mentality that a computer should be approached not as a mysterious, heavy box, but as something you could simply pick up and move.
In 1998 computers were clunky and ugly. This is a work of art.
Below is the culmination of the email exchange I had with the iMac’s owner.
This iMac is pretty early in the build cycle. I think it’s an early Rev. B model. It was made early November 1998. I sold a bunch of Bondi iMacs when they came out. The most memorable aspect of being an Apple Reseller in the 1990s was frequently going to Macworld Keynotes (mostly in New York plus one in Boston). The best was in 1999 for the introduction of WiFi and the iBook. The single best computer theater I saw was when Phil Schiller jumped from a window above the stage to a waiting Steve Jobs to show the iBook was not connected to any wires.
He was at the event when Phil Schiller jumped to show that the iBook’s internet connection was wireless using Apple’s AirPort card. Incredible.
I am a computer industry living fossil. I began writing assembler language small subroutines in 1974 on punch cards. Short 10-15 lines of code subs to do a very specific individual task. Assembler was a very compact language. I worked as a Cobol programmer for about a dozen years. I ran large scale projects. My first encounter with Apple was 1983 testing the Lisa. I was invited to Apple in 1986 or 87 to take a look at a new technology they were working on. After being led deep inside a building we emerged into a lab and were shown what was to be called QuickTime. Exciting stuff. In my early corporate days around 1981 I sat in the back of the room for a demo conducted by Phil Estridge of IBM (he and his team died when a Delta jet went down in a storm on approach to Dallas. IBM never recovered from that loss). He was demonstrating what was to become the IBM PC. He had the team from Microsoft showing off DOS… Bill Gates and Paul Allen. Who knew then what was to be.
I’ve never even seen a Lisa and I had no idea QuickTime development started in the late 1980s!
We were at the introduction of the Newton. The wife was at the intro of Apple’s first laptop. We went to the intro of the IIfx. The list goes on.
I have a Newton MessagePad H1000 and an eMate 300 in the museum. Clearly I must find more “fossils” out there to hear old Apple stories and purchase their relics.
A couple weeks ago the blogosphere celebrated Mac OS X’s (now macOS) 20th anniversary. It’s comical to reflect on the early days of Mac OS X. I installed Mac OS 10.0 (the first version was 10.0, and each year’s release incremented it by .1) on my Power Mac G3 400 (Blue & White) and immediately lost access to my modem. This was before we had DSL, so my only connection to the Internet was through America Online. (Napster was extremely slow with a maximum speed of 5 kilobytes per second.)
I found an incredible collection of Apple photos, videos, and promotional materials.
Fortunately I could still access my printer through Classic mode. I had the fast, quiet Apple LaserWriter 4/600 PS, which could only connect to a modern Mac using a $100 Ethernet-to-LocalTalk adapter. I would write in Microsoft Word in Mac OS X, and then open the same document in Classic mode to print. (Eventually Mac OS 10.1 was released and I was forced to purchase a new HP printer that had proper driver support.)
I vividly remember the first time I launched Mac OS 10.0. I clicked and held down on the top of a window, and dragged the mouse to move the window to a new location on the screen. The entire window moved with my mouse! In Classic Mac OS (8, 8.5, 9, etc.) only an outline of a window moved when you dragged it around. Seeing the entire window move was exhilarating. I’m sure this is difficult to imagine after so many years of being accustomed to this behavior. Think of the first time you saw spinning rims on a car. Your brain halted and said, “Wait, that’s possible?” For me it was similar to the moment I first saw Steve Jobs swipe to unlock an iPhone.
Podcasters have discussed how slow Mac OS 10.0 and 10.1 were, and I honestly do not remember this. I’m not disputing their statements, but perhaps I was distracted by the lickable (Steve Jobs famously said “We made the buttons on the screen look so good you’ll want to lick them”) Aqua interface, and the anxiety from taking Introduction to Computer Science. (Remember CodeWarrior?) Apple and my G3 were still a big part of my personality, but I was just beginning my college career.
Here are some podcasts that I recommend listening to if you want to feel nostalgic about macOS:
Designers beginning their careers or looking for their next gig often ask the same question when working on portfolio case studies: “Where do I begin?!” After writing a few paragraphs they become frustrated and unsure of their progress. The design must be beautiful! I’m a designer, right? My case study had better be perfect! Fortunately there is a solution to this problem.
When working on your case study it is crucial that you separate the design from the content. Do not start with a blinking cursor. Do not jump right into WordPress or Squarespace or Tumblr (I once used Tumblr to build a portfolio). Look at the screenshot below. How can you possibly juggle how your portfolio should look while trying to write?
Just looking at this gives me anxiety.
Here is how I approach writing case studies. Remember high school or college English classes? Remember essays? Start with a pencil, Google Docs, Pages (I used Pages to write all of my essays at Cornell), or a basic text editor (I use iA Writer) and focus on the content at a high level. I quickly wrote an outline for an app I worked on in 2015. (Sadly it is gone forever because competing against Facebook in the social networking space is not ideal.) Here are some questions to consider:
What is the problem you and your team tried to solve? Perhaps it’s “There are no apps for me to look at my friends’ product reviews.”
What is the story you want to tell? Perhaps it’s “I worked with a small team to build a beautiful app but in the end we learned that we failed to find a real problem.”
What sections do you need? Perhaps it’s introduction, problem, exploration, solution, and outcome.
What images do you need to include? Perhaps you need screenshots of how product reviews look on typical social networking apps, prototypes, illustrations that convey the goals of a few prototypes, marketing materials, and final mockups.
Where will you put images? Avoid using so many images that they get in the way of telling a story.
Write a thorough outline. Getting more information out of your head at this stage will help you efficiently write the details later.
Just fill out each section and prepare your screenshots!
Now convert each line of text into a few sentences or a paragraph. Fill in the details. Get into a writing flow. Again: you are not thinking about design. You are focusing on content. Typography and colors come later. Tell your story!
Yes, writing is crucial for a successful designer. Deal with it.
Once you are confident that the content is ready (including image captions, because they are a fun way to inject personality), you can select a content management system and browse templates. I’ve used so many (even Carbonmade!) and they all have pros and cons. If you want to discuss in detail please feel free to Tweet me. In my opinion it is wholly unnecessary to design and build a template from scratch now. Even Jekyll has templates (although most are pretty scrappy looking).
OK, you’ve selected a template and you’re ready to write. Wait, you already wrote everything in Pages (seriously, give it a try)! Simply copy and paste your text and upload your images. When your content is already written, building your portfolio is fast and easy. After inputting everything you can focus on the design. Tweak your typography. Mess with colors. Try different templates. You are free to experiment. You finished the hard part; now comes the fun.
As discussed last week in my post about portfolio presentations, designers inevitably have to share their work when applying for a job. What I witness often is the uninspiring real estate tour (or “harbor tour” if you are from the UK). Designers present a high fidelity mockup and talk about the navigation, content, buttons, personas, colors, etc. They are skipping the most important part: the problem.
Every project, flow, and mockup must be presented as part of a problem/solution pair. What is the problem you are trying to solve, and why is this solution ideal? For example, when I applied to become a product designer at Salesforce, I presented my work at iControl Networks (now Alarm.com). I did not start with an interface; I started with a very high-level problem: security systems are old. They run on POTS (plain old telephone service), are not connected to broadband, do not support new technologies like Zigbee or Z-Wave, and, most importantly, are difficult to use.
Wait! Is this the right problem? No! This immediately jumps to the technology. Why should the hiring manager care that security systems are old? How can you connect the hiring manager to a real world problem? People purchase a security system because they want to protect their family. But what happens when you leave your home?
Problem: I’m not home, and I want to know that my family is safe.
Solution: Build a system that allows people to quickly know that their family is safe.
By the way, I’m not exaggerating; this is really how I presented my portfolio. One slide to talk about the problem, and another for the solution.
Keynote > PDF > PowerPoint > Google Slides
If you’re presenting at a high level, immediately tell them the solution
Family. Everyone can relate to that. After grounding the audience to the purpose of the company, it’s now time to discuss my team’s role. When you configure a home with a dozen sensors, a security panel, a few cameras, a couple thermostats, and a bunch of smart plugs, it becomes overwhelming trying to ascertain the state of your family and home. Instead of showing the final interface, demonstrate the design challenge.
2003 called…
Oh that’s not ideal. Should I jump to the main event here and show the final interface? No! First explain why a lot of data is overwhelming. The hiring manager can’t appreciate the awesome final mockup without first understanding why it’s awesome. In the image above the user is forced to scan a list of devices’ icons and states, and then make an interpretation about the state of their home. That’s time-consuming and frustrating. There must be a better way. Perhaps a symbol.
When you start simple the audience can follow along
One symbol to summarize all activity in the home. The problem here is there’s so much device and sensor activity that you may not be confident you fully comprehend everything after glancing at one symbol. Windows may be open, people may be tripping motion sensors while walking around, the security panel may be in a specific arm state, etc. This can be solved by including a few words to summarize the state of the home.
Still simple!
You can see this concept coming together after a few slides. It’s important to note that the audience has still not seen high fidelity mockups. First I dedicated time to establishing context by explaining the problem, and now I’m walking them through how we solved it. By this point the audience should have a clear idea of the interface’s structure and purpose. It’s time to add fidelity.
High fidelity time. Check out that gradient.
We arrived. The symbol plus a couple text snippets like “Armed Stay. All Quiet.” tells the user that the security panel is armed, no sensors are tripped, and all devices are online. Here is where you remind the audience what the problem is and explain why this solution is ideal. There’s a lot of data produced by sensors and devices. This interface allows the user to launch the app, glance at the interface, and feel confident that their family and home are safe. We started with a list of devices and their individual states, and ended with a symbol paired with a few words.
Without context mockups are just pictures. They may be beautiful or innovative, but context is what enables the hiring manager to understand what they are looking at and why. A problem/solution pair connects your mockups to a complete user experience. At the end of your presentation the hiring manager should be confident that you can solve her problems.