How to Design Delightful Experiences for the Internet of Things

ContentFlo is a knowledge property that captures the trends in digital, technology and marketing from across the industry.

ContentFlo shares perspectives and insights that matter to enterprises in the digital economy.

This post was originally published by Sergio Ortiz, Designer @ Toptal.

One of the next technological revolutions on the horizon is the emerging platform of the Internet of Things (IoT). The core of its promise is a world where household appliances, cars, trucks, trains, clothes, medical devices, and much more would be connected to the internet via smart sensors capable of sensing and sharing information.

As its presence in our lives grows, the Internet of Things (IoT) will be fundamental to most things we see, touch, and experience—UX design will play an important, if not essential, role in that advancement.

From healthcare to transportation, from retail to industrial applications, companies are constantly searching for new ideas and solutions to create new experiences, deliver greater value to customers, and make people's lives easier and more efficient.

If you think you don’t know what IoT is, you’ve probably already experienced it and just didn’t realize what it was. Home automation hubs like Google’s Home and Amazon’s Alexa, the Nest Learning Thermostat, Philips Hue lightbulbs, Samsung SmartThings, Amazon Go, and fridges that monitor their contents all fall into the IoT category.

Flo, a smart residential water system that monitors water efficiency, leaks, and waste

The next wave of IoT will connect millions of devices across the globe and make homes, cities, transportation, and factories smarter and more efficient. Real-time data produced by hundreds of IoT sensors will change the way businesses operate and how we see the world.

The skills needed in this new paradigm will shift from component thinking to whole systems thinking; from one screen to multiple touch-points. Most IoT systems will be connected to an app, but this will eventually evolve into a multi-interface world, some of it yet to be invented.

Designers must adapt to new technologies and paradigms or risk becoming irrelevant. Experiences that we design for are shifting dramatically—think AI, VR, AR, MR, IoT, and any combination thereof.

Utilizing live streaming data collected from millions of sensors, designers will be tasked with crafting experiences turning that data into something useful via an interface (a mobile app, smart TV, smart mirror, smartwatch, or car dashboard).

There will be huge opportunities for designers in the industrial Internet of Things. Organizations of all types and industries are investing heavily in this space, making IoT growth projections astronomical—to the tune of 50 billion connected devices by 2020.

Graphic by Clarice Technologies

IoT Is Already Here

An example of an IoT ecosystem available today is an internet connected doorbell that has a video camera, speaker, microphone, and motion sensor. When a visitor either rings the doorbell or comes near the front door, the owner receives a notification on their mobile via the app. The owner is able to communicate with the visitor via the speaker and microphone; they can let the visitor in via a remote controlled door lock or instruct a delivery person to leave the package somewhere safe.
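The flow described above can be sketched in a few lines of code. This is a hypothetical illustration of the event-to-notification logic, not any real doorbell's API; all names are invented for the example.

```python
# Hypothetical sketch of a smart-doorbell event flow: the device reports
# sensor events, and a handler decides what notification the owner's app
# shows and which remote actions it offers.

def handle_doorbell_event(event: dict) -> dict:
    """Map a raw device event to an app notification and available actions."""
    if event["type"] == "ring":
        return {
            "notify": "Someone is at the front door",
            "actions": ["open_two_way_audio", "unlock_door", "dismiss"],
        }
    if event["type"] == "motion":
        return {
            "notify": "Motion detected near the front door",
            "actions": ["view_live_video", "dismiss"],
        }
    return {"notify": None, "actions": []}

print(handle_doorbell_event({"type": "ring"})["notify"])
```

The design question for the UX designer is exactly what this sketch glosses over: which events deserve a notification at all, and which actions to surface in that moment.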

SkyBell is a smart video doorbell that allows you to see, hear, and speak to the visitor at your door whether you're at home, at work, or on the go.

Another example is Nanit—a connected baby monitor with computer vision. It has real-time HD video with night vision, plus temperature and humidity sensors. Its app gives you access to recorded and live HD video streams and smart notifications.

The IoT baby monitor Nanit

Implications for UX Design

These new experiences will require new modes of interaction—modalities, yet to be designed. Touch will evolve and expand. Gestures and physical body motion will become a more natural way of interacting with the digital world around us.

The IoT space is ready for exploration and designers need to investigate the potential human interaction models, how to design for them and find ways to unlock value. The focus will no longer be on singular experiences, but instead those that represent a broader ecosystem.

The Myo armband

Designers will become involved during every stage of the design process as it will become more about designing the entire product experience.

They will need to share creative authority throughout the development cycle and effectively influence the outcome of the end product, working in collaboration with an industrial designer on, for example, what that IoT doorbell looks like, how it works, the video and sound between the two parties, and the unlocking and locking of the door.

Five Critical Aspects for Designers to Consider in the IoT Era

1) Prepare for Evolving User Interactions

Google Home connects seamlessly with smart IoT devices so you can use voice to set the perfect temperature or turn down the lights.

Just as touchscreens introduced the pinch, finger scroll, and swipe, we’ll soon be introducing other ways of interacting with IoT systems. We can expect that hand gestures will continue to be used, but we’ll begin to see even more natural movements, such as tiny finger movements, as options for controlling devices in our environment.

Google is already preparing for a future where hand and finger movements will control things in our environment. Its Project Soli is an interaction sensor that uses radar for motion tracking of the human hand.

Radar-sensed hand and finger tracking (Google’s Project Soli)

IoT will no doubt be integrated with VR. With VR, our movements mimic those of the real world. Moving our heads up, down and around allows us to explore the VR world in a natural way. We’ll be able to control our environment through commonly used arm, hand, and finger movements.

Merging the VR experience with IoT opens up many new possibilities. Imagine an Amazon Go VR version—a self-serve grocery store in a VR world where a customer “walks in” and collects items into their virtual shopping cart by picking up their choices from the store shelves with natural hand movements.

For designers, feedback and confirmation are important considerations in this new paradigm as are many of the 10 Usability Heuristics for User Interface Design. Many of these “rules of thumb” will live on:

  • Visibility of system status
  • Match between the system and the real world
  • User control and freedom
  • Consistency and standards
  • Flexibility and efficiency of use
  • Help users recognize, diagnose, and recover from errors

Voice will play a huge role. Even the act of walking will dictate some level of control. As these new controls get more refined and are adopted by users, they will become the standard by which we interact in this space, whether a screen is present or not.

Using Amazon Alexa is as simple as asking a question. Just ask to play music, read the news, control your smart home, call a car.

What about other tactile, sensory or emotive inputs? How will emotions and physiology apply to this space? Designers must get ahead of this new paradigm or risk being left behind.

2) Rethink and Adapt to Interactions of the Future

It’s safe to say that, for example, things like the ‘menu’ in a user interface will in some shape or form always be a part of the experience. And just as we saw the introduction of the hamburger menu once mobile became ubiquitous, we’ll need to explore its evolution (or something similar) more extensively within IoT environments.

You need look no further than wearables like Samsung’s Gear S3 Watch to see how menu controls might evolve.

As we create the UIs of the future and new modes of interaction, we’ll need to make sure we keep in mind the users’ expectations. Designers will still need to follow usability and interaction standards, conventions, and best practices. By evolving from what is already known, the potential of new technologies can be harnessed—innovative UIs can be designed while still maintaining enough familiarity for them to be usable.

In the not-too-distant future, our daily lives will be filled with micro-interactions as we move from device to device and UI to UI. There will not be just one, but many interfaces to interact with in a multitude of ways as people move through their day. An interaction may begin at home on a smart mirror, continue on a smartwatch down the street and on a mobile in a taxi, and then finish on a desktop at work. Continuity and consistency will play an important part.

As IoT continues to grow and evolve, we’ll encounter never-before-seen devices, new methods of interaction, and many varieties of associated UI. Those of us who design in these new environments will need to strike the right balance between the familiar and the new.

3) Design Contextual Experiences

IoT will achieve mass adoption by consumers and businesses when products are easily understood, affordable, and seamlessly integrated into their lives. This means we need to expand beyond personalization, and begin to infuse context into the experience.

Designing for context has the potential to permeate experiences, making them more meaningful and valuable.

As we design contextual, holistic experiences that will harness the power of IoT, we need to understand that being inconspicuous, far from being a bad thing, may be the goal. When the IoT product knows you, knows where you are, and knows what you need, it will only make itself present as needed. Things will adapt to people, and before we know it, become fully integrated into their daily lives.

As we design UIs for this new paradigm, we’ll need to understand that the human-computer interaction will be dynamic and contextual—and it will change constantly. At times we’ll need to allow for controls, while at others the systems will simply relay data with notifications that are useful in that moment. Each view will be intelligently displayed in the context of that very moment via the most appropriate channel and device. This contextual design would be micro-interaction driven, timely, and purposeful.

4) Design Anticipatory Experiences

One of the most promising characteristics of IoT is the ability to predict and adapt to situations. The old model of singular actions driving singular reactions is evolving at a rapid pace.

It’s going to be more about the output without much need for input.

“Magical experiences” will be born out of awesome combinations of AI, machine learning, computer vision, sensor fusion, augmented reality, virtual reality, IoT, and anticipatory design. Rumor has it that Apple is investing heavily in AR.

We will be surrounded by a growing number of intelligent IoT systems that will automatically do things for us in a predictive manner. For example, after a few uses, the Nest thermostat learns our habits and adjusts intelligently without us needing to get involved.

We’ll begin to see systems that will become increasingly predictive. A simple gesture, movement, or word will initiate a series of useful events. There will be a chain of events that aren’t initiated by people at all, because the system will learn and optimize its actions based on a treasure trove of data. These events could be initiated by a person’s proximity, the time of day, environmental conditions (such as light, humidity, temperature, etc.), and previous behavioral data.
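The kind of context-driven triggering described above can be sketched as a simple rule function. The thresholds, device names, and context fields here are all invented for illustration; a real system would learn these rules from behavioral data rather than hard-code them.

```python
# Illustrative sketch of anticipatory behavior: the system acts on a
# context snapshot (proximity, time of day, ambient conditions) with no
# explicit user input. All thresholds and action names are hypothetical.

def anticipate(context: dict) -> list:
    """Return the actions the system would take for a given context."""
    actions = []
    # Owner approaching home in the evening -> light the entrance.
    if context["owner_distance_m"] < 50 and context["hour"] >= 18:
        actions.append("turn_on_porch_light")
    # Cold house and owner nearby -> start heating before arrival.
    if context["indoor_temp_c"] < 19 and context["owner_distance_m"] < 1000:
        actions.append("preheat_to_comfort_temp")
    # Dark and early morning -> soft lighting instead of full brightness.
    if context["ambient_lux"] < 10 and context["hour"] < 7:
        actions.append("dim_hallway_lights")
    return actions

print(anticipate({"owner_distance_m": 30, "hour": 19,
                  "indoor_temp_c": 18, "ambient_lux": 200}))
```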

More than ever, deep user research will play an important role in designing experiences that are anticipatory and contextual. Defining personas, observing user behaviors, and empathy mapping—just to name a few UX techniques—will become crucial in crafting sophisticated user experiences that will feel almost “magical” to people.

5) Most Importantly, Make It Useful!

We’re seeing tremendous advancements in the field of IoT and the role that design will play in it is about empowering people in ways that were not possible before. The demand for deeply satisfying, quality experiences will increase with high expectations and standards.

While all of the above is important, we must never lose sight of the fact that it’s about making people’s lives easier. Designing “moments of delightful surprise” in this new paradigm—along with deep empathy for the user—is a skill designers will need to develop. As we look towards an even more connected digital future, connecting us to “intelligent things” in meaningful ways will allow for more efficient interaction, more productivity and, hopefully, happier lives.

Designers will need to design IoT-driven experiences that are contextual, helpful, and meaningful—optimized for people, not technologies.

“Experiences” will trump “things.”

The next step is for designers to become involved, and design the most seamless user experiences for the Internet of Things. Technologies must evolve into “optimizers of our lives.”

In other words, become useful for people.


The Industry Could Do Without Pixel Density And PPI Marketing


This post was originally published by Nermin Hajdarbegovic, Technical Editor @ Toptal.

A long, long time ago, I used to make a bit of money on the side designing and printing business cards, along with ad materials and various documents. I was young and I needed the cash, and so did my buddy. Some of it went towards new hardware, while much of it was burned on 3-day barbecue binges, fuelled by cheap beer and brandy.

It didn’t take us long to realize the HP and Epson spec sheets, which proudly cited insanely high DPI for their printers and scanners, were as pointless as a Facebook share button on a kinky fetish site. So, we started using cheaper, older hardware other people didn’t want, and put the savings to good use: more meat and more booze. Fast forward twenty years, we still like to work hard and afford the finer things in life, and some of them were in part made possible by tech frugality. We didn’t buy into printer DPI poppycock back then, and we certainly couldn’t care less about display PPI today.

But your average consumer does. Oh yes, most of them still think they can actually see the difference between 440 and 550 pixels per inch (PPI) on their latest gadget. I might have missed a few things over the past two decades, but either the human eye has evolved to such a degree that all kids and many millennials have better vision than ace fighter pilots, or they’re just delusional.

I’ll go with delusional, or at least immature, because I figured out my eyes weren’t that good when I was 15.

In this post I will try to explain what led the industry astray, and what developers and designers need to keep in mind when developing for this new breed of device. You may notice that I have some strong opinions on the subject, but this is not supposed to be a bland, unbiased report on a purely technical issue. The problem was not created by engineers; you'll have to get in touch with marketing to find the responsible parties.

How Did The PPI Lunacy Get Started Anyway?

One word: Apple.

Apple was the catalyst, but it actually turned out to be the good guy in the long run. The real culprit was the Android mob.

Apple introduced the Retina marketing gimmick with the launch of the iPhone 4, which boasted a small, hi-res display that blew the competition out of the water. In fact, it still looks quite good, and there is a good reason for that: Our eyes couldn’t tell the difference in 2010, and guess what, they can’t tell the difference in 2015.

Most people associate Retina displays with the density of the iPhone 4 display, which was 326 PPI (614,400 pixels on a 3.5-inch display). This is not inaccurate; saying that anything above 300 PPI can be considered a Retina display is more or less correct when talking about a mobile phone. The same metric cannot be applied to other devices because the typical viewing distance is different. Apple’s standard for mobile phones (at the time) was 10 to 12 inches, or 25 to 30 centimetres. The typical viewing distance for tablets is often cited at 15 inches or 38 centimetres, while desktop and notebook screens are viewed from about 20 inches (51 centimetres).
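PPI itself is trivial to derive from a spec sheet. A quick sketch (the small gap to Apple's quoted 326 comes from the fact that the nominal 3.5-inch diagonal is rounded):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from the panel resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# iPhone 4: 640x960 on a nominally 3.5-inch diagonal
print(round(ppi(640, 960, 3.5), 1))  # ~329.7; Apple quotes 326
```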

You can probably spot an issue here. Did you use your iPhone 4 at the typical 10-inch viewing distance? Maybe. But what about the iPhone 6 Plus, with two extra inches? Probably not. One of the good things about having an oversized phone is that you don’t need to bring it up to your face to view a notification or a message. Sometimes I don’t even pick my phone up, I just tap it next to my keyboard. Sometimes I pick it up and shoot off a short text without taking my wrist off the table, at a desktop keyboard distance, which is much closer to what Apple had in mind for notebook and desktop screens than mobiles or even tablets.

Being the young, insecure kids on the block, the Android gang quickly decided they had to do something about the iPhone 4. The response was swift and came in the form of 720p smartphones, with panels measuring 4.5 to 4.8 inches. When frustrated teens try to outdo someone else, they tend to overdo it, so a generation or two later, 1080p panels became mainstream, and they got bigger, 4.8 to 5.2 inches. The latest Samsung flagship, the Galaxy S6, boasts a 5.1-inch Quad HD Super AMOLED display, with a resolution of 2560 x 1440 and, wait for it, 577 PPI. There is just one thing: The panel uses a PenTile matrix, so many people would argue that it’s not really 2560×1440. Who cares, bigger numbers sell, right?

In Samsung’s defense, the Korean giant did create a use-case for such high resolution screens, sort of. It’s a simple and relatively inexpensive Virtual Reality solution dubbed Gear VR. Google Cardboard is more of the same, but Samsung seems to be taking the whole VR trend a bit more seriously.

The Invasion Of Oversized Androids

There was a bit of a problem with this approach. Like it or not, once you start chasing pixels, you are more likely to end up with a bigger screen. This means more backlighting, more GPU load, more juice, and a bigger battery. And, how many pixels does that leave us with anyway?

Well, for a 720p display, the phone has to render 921,600 pixels for every frame. That goes up to 2,073,600 for 1080p displays and culminates in 3,686,400 on a 1440p panel like the one used in the Galaxy S6. To go from 720p to 1440p, an application processor has to figure out what to do with four times as many pixels with each refresh cycle. This, obviously, isn’t very good for battery life, although Samsung did a great job, thanks to its efficient AMOLED technology and industry-leading 14nm SoC. However, the general trend was a vicious circle and many vendors simply keep making bigger screens, to hide the even bigger battery at the back.
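The arithmetic above is simple enough to verify in a couple of lines:

```python
# Pixels the application processor must render per frame at each
# common panel resolution.
def pixels_per_frame(width: int, height: int) -> int:
    return width * height

p720  = pixels_per_frame(1280, 720)   # 921,600
p1080 = pixels_per_frame(1920, 1080)  # 2,073,600
p1440 = pixels_per_frame(2560, 1440)  # 3,686,400

print(p1440 // p720)  # 4 -- four times the work per refresh cycle
```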

Apple may have started the craze, but the real troublemakers are found elsewhere, hiding behind green droids.

I know what some of you are thinking: “But consumers want bigger phones!”

No. Consumers want whatever you tell them. That’s a fact.

In this case, the industry is telling them they want bigger phones, which are becoming so unwieldy that the next thing they’ll need are smartwatches, so they don’t have to drag their huge phablets from their pockets and purses. Convenient, is it not? Huge phones with insanely high pixel densities are a triumph of cunning marketing over sensible engineering.

Besides, people also want better battery life and we’re not seeing much progress in that department. The industry is tackling this issue with bigger batteries, some of which are almost powerful enough to jumpstart a car. I wonder what will happen when one of them, built by the lowest bidder, decides it’s had enough, springs a leak, or accidentally gets punctured?

That’s one of the reasons why I always found those tabloid headlines about smartphones stopping bullets so hilarious. Sure, it can happen under the right circumstances, but theoretically, you can also win the lottery and get struck by lightning when you go out to celebrate.

Instead Of Pointless PPI, Try Using PPD

While PPI has already been rendered a pointless metric, especially in the era of convertible, hybrid devices, the same cannot be said of pixels per degree (PPD). I think this is a much more accurate metric, and a more honest one at that.

Unlike PPI, which only deals with density, PPD takes viewing distance into account as well, so the same number makes sense on a smart watch and a 27-inch desktop display. Here is how it works, taking the good old iPhone 4 as an example.

The device has a 326 PPI display that's meant to be used 10 inches from the eye, which works out to 57.9 PPD at the centre of the image, going up to 58.5 PPD at the edge. This is almost at the limit of 20/20 vision. If you have better than 20/20 eyesight, you could theoretically benefit from a higher resolution, but on a backlit screen, covered by smooth and reflective glass, with a pinch of anti-aliasing, few people could ever tell the difference.

The PPD formula is simple: PPD = 2 × d × r × tan(0.5°), where d is the viewing distance and r is the display resolution in pixels per unit length.

Before you start breaking out your trusty TI calculators, here’s an online PPD calculator you can use.
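For those who prefer code to calculators, the same formula takes only a few lines of Python. Note that this straightforward version yields figures slightly lower than the ones quoted in this article, which appear to come from a variant of the calculation; the relative comparisons hold either way.

```python
import math

def ppd(ppi: float, distance_in: float) -> float:
    """Pixels per degree: 2 * d * r * tan(0.5 deg),
    where d is viewing distance and r is pixels per inch."""
    return 2 * distance_in * ppi * math.tan(math.radians(0.5))

# iPhone 4 (326 PPI) at a 10-inch viewing distance
print(round(ppd(326, 10), 1))  # -> 56.9
```

Doubling the viewing distance doubles the PPD, which is exactly why the same metric makes sense for a smartwatch and a 27-inch monitor.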

So, let’s see what happens with a 5.5-inch 1080p phablet (iPhone 6 Plus) if we change the viewing distance. At the standard 10 inches, we end up with 71.2 PPD, however, at 11 inches the number goes up dramatically, to 78.1 PPD. At 12 inches it stands at 85 PPD, and at 13 inches we see 91.9 PPD.

Now let’s take a look at some cheap Androids with 720p panels. The visual density of a 5-incher at 10 inches is 52.1 PPD, but since I doubt the distance is realistic (if we are using the same distance for a 3.5-inch iPhone), let’s see what happens at 11 and 12 inches: we get 57.1 PPD and 62.2 PPD respectively. An entry-level 5.5-inch phablet with the same resolution has a density of 47.5 PPD at 10 inches, but at a more realistic 13 inches, we end up with 61.3 PPD. Granted, used at the same viewing distance as the iPhone 4, these numbers look bad, but few people will use these much bigger devices at the exact same distance.

So, why am I changing the viewing distance to begin with? As I pointed out earlier, that’s something most users do without even noticing, especially on Android phones. When I upgraded from a 4.7-inch Nexus to a 5-incher with capacitive buttons, I noticed a slight difference in the way I handled it. When I started playing around with a few 5.5-inch phablets and going back and forth between them, the difference became more apparent. Of course, it will depend on the user; someone might have the exact same viewing distance with a 4-inch Nexus S and a 6-inch Nexus 6, but I doubt it. This is particularly true of Android because the UI is more or less a one-size-fits-all affair and does not take into account the loads of different panel sizes out there. Since I am a fan of stock Android, the difference was even more apparent; Lollipop looks almost the same on a 4.7-inch Nexus 4 and a white-box 5.5-inch phablet.

Apple does it differently. Well, to be honest, Apple didn’t even have to do it until the launch of the iPhone 6 Plus, because it only offered one screen size, which allowed it to optimize the user experience in no time.

Why Pixel Density Matters

Why should developers and designers care about all this? It’s mostly a hardware thing, anyway. Developers have nothing to do with this mess, just let Google, Samsung, Motorola and Apple sort it out.

Devs and designers aren’t part of the problem, but they can be part of the solution.

Like it or not, we have to waste perfectly good clock cycles and milliamps on these power-hungry things. Unfortunately, apart from optimisation, there’s not much developers can do. All mobile apps should be optimised for low power anyway, so that doesn’t make a difference. Designers can’t take into account every single resolution on every single screen size when they polish their designs. At the exact same resolution, they might have to use virtually no anti-aliasing, or moderate anti-aliasing, or go all out with some really aggressive edge-softening. It all depends on the type of device and screen size, not the resolution.

This is trickier than it sounds. Using slightly different settings for 5-inch and 5.5-inch devices with standard resolution screens sounds easy enough, but it would only address one side of the problem. Will a tall, 40-year-old Swede use a 5.5-inch 1080p phone at the same eye distance as a 14-year-old Taiwanese teen chatting with her girlfriends? Of course not.

This, among other things, is why I’ve come to despise PPI. It’s become a useless marketing number; it does not provide consumers with accurate information when they purchase a new device, and from a developer’s perspective, the PPI arms race is doing more harm than good. It’s not making hardware better in a noticeable way, yet it’s making it more expensive and less efficient. It is no longer improving the user experience, either, and in some cases it is even degrading it.

A few years ago, mobile designers had to take into account a few standard Apple resolutions and a handful of Android resolutions and screen sizes. Now, they have to deal with Apple products in more aspect ratios, resolutions, and pixel densities. Android, due to its trademark fragmentation, poses a lot more challenges than Apple or Windows (Phone). While the trend has been to inch towards bigger screens and higher resolutions, a lot of Android devices still ship with 4.x-inch screens and sub-720p resolutions. Add to that a host of legacy devices, and you end up with a pool of green goo.
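Android's own answer to this fragmentation is density-independent pixels: layouts are specified in dp and scaled against a 160 dpi baseline, with devices sorted into documented density buckets. A minimal sketch of that conversion (the bucket values follow Android's documentation; the loop is just illustrative):

```python
# Android-style density-independent pixels: a dp value is scaled by the
# device's density bucket relative to the 160 dpi mdpi baseline.
DENSITY_BUCKETS = {"mdpi": 160, "hdpi": 240, "xhdpi": 320,
                   "xxhdpi": 480, "xxxhdpi": 640}

def dp_to_px(dp: float, bucket: str) -> int:
    return round(dp * DENSITY_BUCKETS[bucket] / 160)

# The same 48dp touch target rendered on three different screens:
for bucket in ("mdpi", "xhdpi", "xxxhdpi"):
    print(bucket, dp_to_px(48, bucket))  # 48, 96, 192 physical pixels
```

This solves sizing, but as the article notes, it does nothing about viewing distance, which is why a one-size-fits-all UI can still feel wrong across a 4-inch phone and a 6-inch phablet.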

Ten Easy Ways Of Wrecking User Experience On High-Res Devices

Let’s take a look at how high PPI displays have a negative impact on user experience, starting with hardware and performance issues.

  • Heavy websites are too demanding
  • Battery life and durability may take a substantial hit
  • Effect on storage, bandwidth, load times
  • Games that would otherwise run smoothly become jerky
  • SoC may be throttled, refresh rate lowered

Websites with a lot of demanding content, such as elaborate responsive sites, can be problematic even on underpowered desktops, let alone mobile devices. Five years ago, most of us relied on 1080p desktop displays and the iPhone 3GS had a 480×320 pixel display. Today, most people still use 1080p on desktop platforms, but at the same time they buy 1080p smartphones on the cheap. For some reason, people think it’s OK to place the same strain on a desktop and a $200 smartphone that has a fraction of the processing power. Toptal Software Engineer Vedran Aberle Tokic authored an excellent post dealing with problems caused by responsive websites on mobiles, so please check it out for more details.

Of course, as soon as you start pushing a smartphone or tablet to its limits, battery life takes a massive hit. So, now we have bigger batteries in our phones, and more powerful chargers, and wireless charging, and powerbanks; and we still run out of juice by sundown. This is not just an inconvenience; the battery has to endure more charging cycles, it degrades over time, and now that most smartphones ship with integrated batteries, this poses a problem for the average consumer.

Who cares if your app or website look marginally better than your competitors if they end up draining the battery faster? And, what if your gorgeous hi-res designs end up loading slower, taking up more storage, and sucking more bandwidth than the competition?

Games and other graphically demanding applications might benefit from higher resolutions, but they can also experience nasty performance issues. Casual games that don’t stress the GPU to its limits can look much better in very high resolutions, and they can be smooth even on underpowered hardware. However, 3D games, even casual ones, are a different story.

I am no gamer, and it’s been more than a decade since I was hooked on a game (Civilization, of course). However, I recently discovered World of Tanks Blitz for Android, and experienced a relapse, so here is some anecdotal evidence.

The game is easy to master, fast-paced, doesn’t require wasting hours per match, and it combines my love of history, technology, trolling people, and blowing stuff up. Since I never install games on my phone, I tried it out on a 2048×1536 Android tablet, powered by a 2.16GHz Intel Atom Z3736F processor with 2GB of RAM. UX is good; after all, this is a popular game from a big publisher. Prior to the last update, the system would set the graphics preferences automatically and I was happy with overall performance, about 30 FPS in most situations (dipping into the 20s at times). However, the last update allowed me to tweak graphics options manually, and then I got to see what I was missing out on: much better water shaders, dynamic shadows, fancier particle effects, and so on. I tweaked the settings a bit, but had to trade a lot of eye candy for performance.

With that particular hardware platform, the game would have been able to run at maxed out quality settings at 1024×768, at a substantially higher frame rate. In other words, my user experience would be better on a cheaper and slower device, with just one quarter of the pixels. Changing the resolution would obviously solve everything, but it can’t be done.

Reducing the load would also allow devices to run smoother for longer periods of time, without having to throttle their processors, automatically reduce screen brightness and so on. In some cases, hardware vendors even opted for lower screen refresh rates to preserve battery life and reduce load.

This brings us to aesthetics, and ways of messing up UX on hi-res devices that have nothing to do with performance issues:

  • Reliance on rasterised vs. vector graphics
  • Use of resampled images
  • Viewing old low-res content
  • Using legacy apps
  • Inadequate or overly aggressive anti-aliasing

Although vector graphics play a prominent role in design, we still have to rely on rasterised images for a lot of stuff. Vector graphics are more or less useless for everyday content delivery. For example, when developers create a simple news reader app, it might look magnificent, even on a low budget, on all devices. However, if the content provider doesn’t do a good job, the sleek and sharp design will be ruined, with inadequate visual content, such as low resolution images and video, compression artefacts, bad anti-aliasing, and so on. If forced to reuse older images, they may be tempted to resample them, making an even bigger mess.

The same goes for old content and apps. Not all websites look better on high resolution displays; not all websites are regularly updated to take advantage of new hardware. Ancient CSS does not look good on high PPI devices. Older apps can also misbehave, or end up with a broken UI.

Anti-aliasing can be another problem, but one of the ways of making sure it’s spot on is to rely on PPD rather than PPI. Of course, there is only so much developers and designers can do, especially if their products rely on third-party content, uploaded and maintained by the client.

Things Will Get Worse Before They Get Better

During any period of rapid tech evolution, teething problems are bound to occur. The fast pace of smartphone development and adoption has created numerous opportunities, along with more challenges for developers.

This high resolution race won’t go on for much longer; it’s impractical and becoming pointless. High-res screens are already shipping on low-cost devices, and the trend is going to slow down before it comes to a grinding halt. In the end, we will end up with a few standard resolutions from $200 to $1000 devices, and that’s it. There is a lot of room for improvement on other fronts, specifically, battery life and overall user experience.

Still, I think it’s a good idea to keep an eye on market trends and keep track of sales figures, just to be one step ahead and to know what to expect. It’s almost as important as tracking the spread of different OS versions and platform market share.

Unfortunately, there is not much developers and designers can do to tackle many of these issues. In my humble opinion, the best course of action is to keep clients in the loop, make them aware of potential issues beyond your control and issue clear guidelines on content that should be added to websites and mobile apps.

This post was originally published by NERMIN HAJDARBEGOVIC – TECHNICAL EDITOR @TOPTAL

Busting the Top 5 Myths About Remote Workers


This post was originally published by SCOTT RITTER – SENIOR SUPERVISOR @TOPTAL

Picture this…

You meet your friend Jeff for lunch. Jeff’s been managing the development of a new product and it’s about to be released. He’s pumped.

But the product release isn’t the main thing that’s got Jeff so excited. Instead, Jeff can’t stop raving about his new hire, Luis.

Jeff says Luis is the best software engineer he’s ever had on his team. Luis doesn’t just hit targets others can’t; he creates and hits targets others don’t even see. He’s virtually always available, always accountable, and brings more to the table than anyone Jeff’s ever worked with. And Luis just ‘gets it’.

As Jeff goes on, you’re blown away. But you’re really only half listening at this point, as you picture what it would be like to have someone like Luis on your own team…

But then, suddenly, Jeff says something that shatters the idyllic picture in your mind:

“And get this,” Jeff adds as he leans forward. “Luis is remote.”

What? Remote? How can that be? Everyone knows that hiring remote software developers is fraught with challenges. It’s hard enough if they’re relatively local. It’s even tougher if they’re remote. And it can be nothing short of a disaster if they’re overseas. Luis can’t possibly be that good.

Sorry dude. Your stereotypes have just been shattered.

Wow. You don’t know what to think.

Reality check

In fairness, the remote employment stereotypes you’ve had in your mind until now aren’t entirely unfounded, especially when it comes to offshore workers. Recent years have seen a growing flood of cutthroat overseas development shops that promise extremely inexpensive services and then fail to deliver by even the most basic professional standards. With so many negative experiences, a stigma has developed that all offshore workers are low quality, undependable, and unable to communicate effectively.

To be sure, there certainly are pitfalls to be aware of and to avoid when hiring remote workers, especially overseas. But if the negative stereotype of remote hiring were as across-the-board-true as some would have you believe, then how does one explain the dramatic 80% increase in the remote workforce from 2005 to 2012? The simple fact is that working and hiring remotely is a growing and increasingly successful paradigm.

Case in point:  GitHub. GitHub hosts over 10 million code repositories and recently received $100 million in Series A funding. Not too shabby. And guess what?  Over two-thirds of their employees are remote, distributed all across the globe. Seems safe to assume that the remote employment model is working really well for them.

And then there’s Dell – the IT mega-vendor with annual revenues in excess of $62 billion in 2012. Dell must be highly confident in the viability of remote work as part of its core business model as well, having recently announced a goal of having half of its employees work remotely within six years (by 2020).

Clearly, there must be more to hiring software engineers and working remotely than traditional stereotypes would lead one to believe. In truth, when employed properly and intelligently, a staffing strategy that incorporates remote team members can be a huge win for everyone involved. Some of the best talent in the world telecommutes and, whether we want to admit it or not, a growing percentage of that talent is overseas.

With that in mind, this post pulls the rug out from under 5 of the most prevalent myths about remote workers, with a specific focus on software developers.

MYTH #1: You get what you pay for.
REALITY: Sometimes you do, sometimes you don’t. It all depends.

Many growing companies and startups are realizing that the very best developers may not be located within commuting distance of their offices. Even if they are, their rates may be ridiculously disproportionate to their skill levels. A local hire in no way guarantees a wise investment.

A key problem with hiring overseas team members is that most employers go about selecting and hiring these remote workers entirely wrong. When most employers turn their sights overseas, they make the penny-wise-and-pound-foolish decision to exploit the differential in labor costs to its fullest and hire the cheapest labor they can find. Some even fool themselves into thinking they’ve avoided this trap by hiring a not-quite-as-cheap software developer, but in the long run they rarely find that their time or money was well spent.

When hiring a remote worker, the primary focus (as with any hiring) needs to remain on quality. Any cost savings that result should merely be viewed as an added bonus. Decisions in life that are solely based on economics rarely prove to be wise ones, and the same most certainly holds true for hiring remote software engineers.

Employ the best, not the cheapest, and you’ll be the beneficiary of remote team members who are nothing short of stellar.

MYTH #2: Offshore software developers aren’t as good as domestic software developers.
REALITY: As a generalization, simply not true.

Ever worked with anyone from the U.S. who was fired for underperforming? I would venture to say that we all have. There are great and terrible developers in Atlanta, Chicago, and New York, just as there are in Argentina, Portugal, and Hungary. Quality is not a deterministic function based on geographic location.

Nothing against the good ol’ USA, but we actually ranked 30th worldwide (out of 65 countries tested) for math in 2012, down from a ranking of 25th in 2009. Not a good sign. And this is particularly significant with regard to hiring software developers, since algorithmic aptitude is often an essential underpinning of effective software engineering. Don’t get me wrong, we have some of the very best developers here in the U.S., but to say that top developers from other countries are not as good as those in the U.S. would simply not be true. Top developers are who they are because they stay up to date on leading-edge technology and because of their commitment to technical excellence, not because of where they happen to be located.

Remember the following truisms:

  • A good remote developer is better than a bad local developer.
  • A great remote developer is better than a good local developer.
  • A top developer is a top developer, regardless of where they are located.

MYTH #3: Differences in culture, language, and time zone are serious problems.
REALITY: They can be, which is why it’s so important to hire wisely.

Yes, the potential for cultural, language, and time zone challenges does exist when hiring a remote web developer or software engineer, but these challenges can certainly be surmounted through a highly exacting recruiting process that centers on an uncompromising commitment to excellence.

Hyam Singer’s post In Search of the Elite Few discusses a methodology for finding and hiring the best software engineers in the industry. That approach is no less applicable to remote or overseas candidates than it is to those who are local.

So with that in mind, let’s examine the cultural, language, and time zone challenges of remote hiring in more detail:

  • Culture. In the context of hiring, when people refer to culture, they often really mean work ethic and moral standards. During the interview process, posing hypothetical “ethical and moral dilemmas” – that don’t have black-and-white, right and wrong answers – is a great way of gauging a candidate’s business ethics and moral compass.
  • Language. There’s no denying that the ability to communicate clearly with our colleagues is essential. Problematic misunderstandings can easily arise from misinterpreted subtleties in language, a problem which is exacerbated when working with someone remotely. It is therefore crucial to thoroughly evaluate communication skills when selecting a remote team member, especially if his or her native language is something other than English. An otherwise stellar team player can prove to be more of a liability than an asset due to an inadequate command of the language.
  • Time Zone. First of all, the U.S. shares its four time zones with numerous other “offshore” cities that have high concentrations of talented software engineers, so “offshore” and “time difference” are not necessarily synonymous. Moreover, as long as the time of day at the remote location is within 5 or 6 hours of your time zone (which is true in a large percentage of cases), you’ll always have at least a few hours of overlapping “at work” time each day.
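The time zone arithmetic is easy to sketch. Assuming rigid, hypothetical nine-to-five workdays at both locations (real teams usually shift their hours to widen the window), the shared “at work” time shrinks linearly with the offset:

```python
def overlap_hours(offset_hours: float, workday_hours: float = 8.0) -> float:
    """Daily overlap between two identical workdays separated by a
    time zone offset. Assumes fixed, hypothetical schedules."""
    return max(0.0, workday_hours - abs(offset_hours))

# A 6-hour offset (e.g., New York vs. Berlin) still leaves a 2-hour
# shared window; a 3-hour offset (e.g., New York vs. San Francisco)
# leaves 5 hours of the standard 8-hour day.
print(overlap_hours(6))  # 2.0
print(overlap_hours(3))  # 5.0
```

Anything beyond a full workday of offset leaves no natural overlap at all, which is when deliberately shifted schedules or asynchronous communication become essential.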

MYTH #4: Remote developers won’t integrate well with your team.
REALITY: If they’re good, they’ll bend over backwards to prove themselves to be stellar team players.

A sharp developer is not only sharp technically, but is also socially and professionally astute, and is therefore well aware of the reservations and skepticism that you may have.

Moreover, just as it is hard for top-notch companies to find superior developers, it is often hard for top-notch developers to find superior companies to work for, especially remotely.

For these reasons, a high-caliber remote developer will often work that much harder to gain your trust and respect. Show her that trust, grant her that respect, and you’ll have more than a team player, you’ll have someone who’s unflinchingly loyal and committed to the success of your project.

MYTH #5: Qualified remote software developers are next to impossible to find.
REALITY: Not if you know where and how (and where and how not!) to look for them.

Well, if you get an unsolicited email from a company touting offshore resources, and it contains a whole slew of grammatical and typographical errors, that’s probably a pretty good indicator of a source that you don’t want to turn to for offshore hires. Remember, we’re looking for quality.

The sources for remote software engineering team members basically fall into three categories:

  • Offshore body shops. This is, unfortunately, the majority of what’s out there and much of the reason for the negative stereotypes that exist. They’re the ones we referred to earlier that promise extremely inexpensive service and then fail to deliver by even the most basic of professional standards.  Avoid them. Like the plague. Period.
  • Independent consultants. This is where the needle-in-a-haystack challenge comes into play. The quality varies widely: some are cream-of-the-crop, but many tend to be inferior. The inferior ones are usually fairly easy to detect from the low caliber of their communication skills or the level of desperation they exude. The premier ones, on the other hand, can be elusive, typically finding work through their own network of contacts. One downside to the best of these remote software developers is that they sometimes overcommit (to avoid a dry spell) and may fall behind on some of their deadlines as a result. After all, it’s hard to be your own marketing department. That said, if you find one of these aces, and if they make your project a priority, they can be a tremendous asset to your team.
  • International freelance networking sites. A number of international freelance networking sites have emerged in recent years, intended to serve as marketplaces connecting customers with remote software engineering resources around the globe. Many independent consultants use these sites to augment their efforts to market their services, so the sites offer customers a more centralized means of accessing global technical talent. However, these networks focus on providing a marketplace for technical services rather than on vouching for the quality of the individual services offered. Accordingly, the challenge here remains one of quality: while some of the resources available through these networks are top-notch, the majority tend to be substandard.

To be sure, global networking is a greater challenge – potentially a much greater challenge – than local or domestic networking. But as has been discussed throughout this post, high-quality technical resources do exist around the globe. By identifying a core group of stellar software engineers in key remote locations, and then using them as the nucleus from which to build out an ever-growing A+ network, one can realize (and offer) the benefits of a globally distributed workforce while minimizing the downside. The company I work for, Toptal, has done precisely that, and is employing this business model with great success.


Great developers live where great developers live. It’s that simple. Many are here in the U.S. Many are in South America. Many are in Ukraine. No country or region has a monopoly on great developers.

The challenge, whether domestically or abroad, is to navigate through the masses to identify the elite few. International high-end developer networks are emerging as a highly effective means of finding and tapping into these valuable resources across the globe. Indeed, great developers tend to gravitate toward great developers, wherever they may be.  And that’s a fact.



Spectacular Crowdfunding Fails And Their Impact On Entrepreneurship


This post was originally published by NERMIN HAJDARBEGOVIC – TECHNICAL EDITOR @ TOPTAL. Republished with permission.

Before I proceed, let me make it absolutely clear that I have nothing against crowdfunding. I believe the basic principle behind crowdfunding is sound, and, in a perfect world, it would boost innovation and provide talented, creative people with an opportunity to turn their dreams into reality.

Unfortunately, we live in the real world, and therefore it’s time for a reality check:

Reality /rɪˈalɪti/

  1. The state of things as they actually exist.
  2. The place where bad crowdfunded ideas come to die.

While most entrepreneurs may feel this mess does not concern them because they don’t dabble in crowdfunding, it could have a negative impact on countless people who are not directly exposed to it:

  1. We are allowing snake oil peddlers to wreck the reputation of crowdfunding and the startup scene.
  2. Reputational risks extend to parties with no direct involvement in crowdfunding.
  3. By failing to clean up the crowdfunding scene, we indirectly deprive legitimate ideas of access to funding and support.
  4. When crowdfunded projects crash and burn, the crowd can quickly turn into a mob.

But Wait, Crowdfunding Gave Us Great Tech Products!

Indeed, but I am not here to talk about the good stuff, and here is why: For every Oculus Rift, there are literally hundreds of utterly asinine ideas vying for crowd-cash.

Unfortunately, people tend to focus on positive examples and overlook everything else. The sad truth is that Oculus Rift is a bad example of crowdfunding because it’s essentially an exception to the rule. The majority of crowdfunding drives don’t succeed.

How did a sound, altruistic concept of democratizing entrepreneurship become synonymous with failure? I could list a few factors:

  • Unprofessional media coverage
  • Social network hype
  • Lack of responsibility and accountability
  • Lack of regulation and oversight

The press should be doing a better job. Major news organizations consistently fail to recognize impossible ideas, which suggests they are incapable of professional, critical news coverage. Many are megaphones for anyone who walks through the door with clickbait.

The press problem is made exponentially worse by social networks, which allow ideas to spread like wildfire. People think outlandish ideas are legitimate because they are covered by huge news outlets, so they share them, assuming the media fact-checked everything.

Once it becomes obvious that a certain crowdfunding initiative is not going to succeed, crowdfunding platforms are supposed to pull the plug. Sadly, they are often slow to react.

Crowdfunding platforms should properly screen campaigns. The industry needs a more effective regulatory framework and oversight.

Realistic Expectations: Are You As Good As Oculus Rift?

Are you familiar with the “Why aren’t we funding this?” meme? Sometimes the meme depicts awesome ideas, sometimes it shows ideas that are “out there” but entertaining nonetheless. The meme could be applied to many crowdfunding campaigns with a twist:

“Why are we funding this?”

This is what I love about crowdfunding. Say you enjoyed some classic games on your NES or Commodore in the eighties. Fast forward three decades and some of these games have a cult following, but the market is too small to get publishers interested. Why not use crowdfunding to connect fans around the globe and launch a campaign to port classic games to new platforms?

You can probably see where I’m going with this: Crowdfunding is a great way of tapping a broad community in all corners of the world, allowing niche products and services to get funded. It’s all about expanding niche markets, increasing the viability of projects with limited mainstream appeal.

When you see a crowdfunding campaign promising to disrupt a mainstream market, that should be a red flag.

Why? Because you don’t need crowdfunding if you have a truly awesome idea and business plan with a lot of mainstream market appeal. You simply need to reach out to a few potential investors and watch the money roll in.

I decided against using failed software-related projects to illustrate my point, for two reasons:

  • Most people are not familiar with the inner workings of software development, and can’t be blamed for not understanding the process.
  • My examples should illustrate the role of hype, and they’re entertaining.

That’s why I’m focusing on two ridiculous campaigns: the Triton artificial gill and the Fontus self-filling water bottle.

Triton Artificial Gill: How Not To Do Crowdfunding

The Triton artificial gill is essentially a fantasy straight out of Bond movies. It’s supposed to allow humans to “breathe” underwater by harvesting oxygen from water. It supposedly accomplishes this using insanely efficient filters with “fine threads and holes, smaller than water molecules” and is powered by a “micro battery” that’s 30 times more powerful than standard batteries, and charges 1,000 times faster.

Sci-Tech Red Flag: Hang on. If you have such battery technology, what the hell do you need crowdfunding for?! Samsung, Apple, Sony, Tesla, Toyota and just about everyone else would be lining up to buy it, turning you into a multi-billionaire overnight.

Let’s sum up why the claims don’t hold up:

  • The necessary battery technology does not exist.
  • The described “filter” is physically impossible to construct.
  • The device would need to “filter” huge amounts of water to extract enough oxygen.
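A back-of-envelope calculation shows why the last point alone sinks the idea. The figures below are rough textbook values I’m assuming for illustration, not numbers from the campaign: surface seawater holds on the order of 8 mg of dissolved oxygen per litre, while a human at rest consumes roughly 0.3 litres of oxygen gas per minute.

```python
# Rough, conservative figures (assumptions for illustration):
O2_DENSITY_G_PER_L = 1.43      # density of O2 gas near 0 C and 1 atm
DISSOLVED_O2_G_PER_L = 0.008   # ~8 mg/L dissolved O2 in surface seawater
RESTING_O2_L_PER_MIN = 0.3     # approximate resting human O2 consumption

o2_needed_g_per_min = RESTING_O2_L_PER_MIN * O2_DENSITY_G_PER_L  # ~0.43 g/min
water_needed_l_per_min = o2_needed_g_per_min / DISSOLVED_O2_G_PER_L

print(f"water to process: ~{water_needed_l_per_min:.0f} L/min")
```

Even with perfect extraction and a motionless diver, the device would need to pump something like 50 litres of water per minute through its “filters”; actual swimming would multiply that several times over.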

Given all the outlandish claims, you’d expect this sort of idea to be exposed for what it is within days. Unfortunately, it was treated as a legitimate project by many media organizations. It spread to social media and eventually raised nearly $900,000 on Indiegogo in a matter of weeks.

Luckily, they had to refund their backers.

Fontus Self-Filling Water Bottle: Fail In The Making

This idea doesn’t sound as bogus as the Triton, because it’s technically possible. Unfortunately, this is a very inefficient way of generating water. A lot of energy is needed to create the necessary temperature differential and cycle enough air to fill up a bottle of water. If you have a dehumidifier or AC unit in your home, you know something about this. Given the amount of energy needed to extract a sufficient amount of water from the air, and the size of the Fontus, it might produce enough water to keep a hamster alive, but not a human.
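The energy arithmetic is easy to sketch. Condensing water vapor releases its latent heat, roughly 2,450 kJ per kilogram at ambient temperature, and every joule of it has to be pumped away by the bottle’s cooler. The efficiency and solar panel figures below are my own optimistic assumptions, not the campaign’s claims:

```python
# Back-of-envelope estimate (all figures are assumptions):
LATENT_HEAT_KJ_PER_KG = 2450.0  # heat released by condensing water (~20 C)
BOTTLE_KG = 0.5                 # half a litre of drinking water
PELTIER_COP = 0.5               # optimistic efficiency for a small thermoelectric cooler
PANEL_W = 20.0                  # generous output for a bottle-sized solar panel

heat_to_remove_kj = LATENT_HEAT_KJ_PER_KG * BOTTLE_KG   # ~1225 kJ
electrical_input_kj = heat_to_remove_kj / PELTIER_COP   # ~2450 kJ
hours_of_sun = electrical_input_kj * 1000 / PANEL_W / 3600

print(f"~{hours_of_sun:.0f} hours of sunshine for half a litre")
```

That works out to more than a full day of continuous sunshine for half a litre, under generous assumptions, and it ignores the fact that ambient air is rarely saturated, which makes matters considerably worse.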

While this idea isn’t as obviously impossible as the Triton, I find it even worse, because it’s still alive and the Indiegogo campaign has already raised about $350,000. What I find even more disturbing is the fact that the campaign was covered by big and reputable news organizations, including Time, Huff Post, The Verge, Mashable, Engadget and so on. You know, the people who should be informing us.

I have a strange feeling the people of California, Mexico, Israel, Saudi Arabia and every other hot, arid corner of the globe are not idiots, which is why they don’t get their water out of thin air. They employ other technologies to solve the problem.

Mainstream Appeal Red Flag: If someone actually developed a technology that could extract water from air with such incredible efficiency, why on Earth would they need crowdfunding? I can’t even think of a commodity with more mainstream appeal than water. Governments around the globe would be keen to invest tens of billions in their solution, bringing abundant distilled water to billions of people with limited access to safe drinking water.

Successful Failures: Cautionary Tales For Tech Entrepreneurs

NASA referred to the ill-fated Apollo 13 mission as a “successful failure”: it never achieved its lunar landing, but the crew overcame near-catastrophic technical problems and returned safely to Earth.

The same could be said of some tech crowdfunding campaigns, like the Ouya Android gaming console, Ubuntu Edge smartphone, and the Kreyos Meteor smartwatch. These campaigns illustrate the difficulty of executing a software/hardware product launch in the real world.

All three were quite attractive, albeit for different reasons:

  • Ouya was envisioned as an inexpensive Android gaming device and media center for people who don’t need a gaming PC or flagship gaming console.
  • Ubuntu Edge was supposed to be a smartphone-desktop hybrid device for Linux lovers.
  • The Kreyos Meteor promised to bring advanced gesture and voice controls to smartwatches.

What went wrong with these projects?

  • Ouya designers used the latest available hardware, which sounded nice when they unveiled the concept, but was outdated by the time it was ready. Soft demand contributed to a lack of developer interest.
  • The Ubuntu Edge was a weird, but good, idea. It managed to raise more than $12 million in a matter of weeks, but the goal was a staggering $32 million. Although quite a few Ubuntu gurus were interested, the campaign proved too ambitious. Like the Ouya, the device came at the wrong time: Smartphone evolution slowed down, competition heated up, prices tumbled.
  • The Kreyos Meteor had an overly optimistic timetable, promising to deliver products just months after the funding closed. It was obviously rushed, and the final version suffered from severe software and hardware glitches. On top of that, demand for smartwatches in general proved to be weak.

These examples should illustrate that even promising ideas run into insurmountable difficulties. They got plenty of attention and money, they were sound concepts, but they didn’t pan out. They were not scams, but they failed.

Even industry leaders make missteps, so we cannot hold crowdfunded startups to a higher standard. Here’s the difference: If a new Microsoft technology turns out to be a dud, or if Samsung rolls out a subpar phone, these failures won’t take the company down with them. Big businesses can afford to take a hit and keep going.

Why Crowdfunding Fails: Fraud, Incompetence, Wishful Thinking?

There is no single reason that would explain all crowdfunding failures, and I hope my examples demonstrate this.

Some failures are obvious scams, and they confirm we need more regulation. Others are bad ideas backed by good marketing, while some are genuinely good ideas that may or may not succeed, just like any other product. Even sound ideas executed by good people can fail.

Does this mean we should forget about crowdfunding? No, but first we have to accept the fact that crowdfunding isn’t for everyone, that it’s not a good choice for every project, and that something is very wrong with crowdfunding today:

  • The idea behind crowdfunding was to help people raise money for small projects.
  • Crowdfunding platforms weren’t supposed to help entrepreneurs raise millions of dollars.
  • Most Kickstarter campaigns never get fully funded, and successful ones usually don’t raise much money. One fifth of submitted campaigns are rejected by Kickstarter, while one in ten fully-funded campaigns never deliver on their promises.
  • Even if all goes well, crowdfunded products still have to survive the ultimate test: The Market.

Unfortunately, some crowdfunding platforms don’t appear eager to scrutinize dodgy campaigns before they raise heaps of money. This is another problem with crowdfunding today: Everyone wants a sweet slice of the crowdfunded pie, but nobody wants a single crumb of responsibility.

That’s why I’m no optimist; I think we will keep seeing spectacular crowdfunding failures in the future.

Why Nobody Cares About Your Great Idea

A wannabe entrepreneur starts chatting to a real entrepreneur:

“I have an awesome idea for an app that will disrupt…”

“Wait. Do you have competent designers, developers, funding?”

“Well, not yet, but…”

“So what you meant to say is that you have nothing?”

This admittedly corny joke illustrates another problem: On their own, ideas are worthless. However, ideas backed up by hard work, research, and a team of competent people are what keeps the industry going.

Investors don’t care about your awesome idea and never will. Once you start executing your idea and get as far as you can on your own, people may take notice. Investors want to see dedication and confidence. They want to see the prototypes, specs, business plans, research; not overproduced videos and promises. If an individual is unwilling or incapable of making the first steps on their own, if they can’t prove they believe in their vision and have the know-how to turn it into reality, then no amount of funding is going to help.

Serious investors don’t just want to see what people hope to do; they want to see what they did before they approached them.

Why not grant the same courtesy to crowdfunding backers?


Bootstrapped: Building A Remote Company


This post was originally published by JAN SCHULZ-HOFEN – FOUNDER & CEO @ PLANIO

If you ask me, working remotely rocks. I’m currently writing from a small beach bar located on a remote island in southern Thailand. Looking up from my laptop, I see nothing but the endless ocean and its crystal clear blue waters. I’ll be enjoying this morning undisturbed and focused on my work because the rest of the team hasn’t even gotten up yet. Time zones work out really well for distributed teams.

My colleague Thomas recently talked to 11 thought leaders in project management about the impact of remote work on a company; some scrum experts argued that distributed teams could work together effectively while others came out strongly against it.

I understand the concerns; you can’t just open up the office doors and release everyone into the wild. It’s not guaranteed that you’ll end up with a thriving business. Marissa Mayer at Yahoo famously axed remote work in 2013 after feeling that some employees abused it.

So how does a tech company get this working remote thing right? Read on. The following is based on our story at Planio and how we made it work.

Enter Planio, my remote company
There are a number of things which motivated me to start my current company. Breaking away from client work while retaining all the benefits of being a location independent freelancer was one of them.

In 2009, I was sitting in the shadow of a cypress grove situated in a beautiful Mediterranean-style garden overlooking the rolling hills of Tuscany, working hard on a new side project of mine: Planio.

It’s a project management tool for people like me: developers. Planio helps make client projects more organized and transparent while reducing the number of tools and platforms needed to do the job. Planio is based on Redmine, an open-source Ruby on Rails project, which I’ve used remotely with my own clients since its very beginnings. So, in a way, remote work is already in Planio’s DNA.

Fast forward to today, and my small side project has grown into a real company. We’re a team of 10 now, serving more than 1,500 businesses worldwide. We have an office in Berlin, but many of us work remotely.

In this article, I’ll dig into the principles, tools and lessons that have helped us along the way. After reading it, I hope you’ll be able to architect your software company so it’s remote-friendly right from the start.

“Talk is cheap. Show me the code.” – Linus Torvalds
Every Thursday we have an all-hands conference call where we discuss what we did the previous week and what’s coming up next.

At the beginning, we spent a lot of time discussing ideas before deciding what to do, but we found that this is a lot harder when some team members are on a poor-quality telephone line and you can’t see them.

Now, we often just “build the thing” and then discuss it – we create a working prototype with a few core ideas and then discuss that. For instance, we recently hit some performance issues with our hosted Git repositories. Instead of discussing and analyzing all the possible ways in which we could potentially save a few milliseconds here and there with every request, my colleague, Holger, just built out his suggested improvements in a proof-of-concept on a staging server to which we directed some of our traffic. It turned out well and these ideas are going into production.

This method focuses everyone’s minds on action rather than talk. The time invested in writing code is paid back in less time spent talking in circles.

Use Text Communication
Real-time communication works against clarity. Instinctively calling a colleague when you need something is easy, but it’s not always your best course of action. I can’t remember the number of times I’ve started writing an email or a Planio ticket about a problem only to solve it myself while writing it down.

Zach Holman, one of the first engineering hires at GitHub, agrees: “Text is explicit. By forcing communication through a textual medium, you’re forcing people to better formulate their ideas.”

Text communication also makes you more respectful of each other’s time, especially when you’re living multiple time zones apart. Immediate communication can be disruptive; the person might be in the middle of figuring out why the last deployment went wrong. With an email, they can consider your write-up at a more convenient time.

Be as Transparent as Possible
Time spent worrying about office politics isn't conducive to shipping working software, and transparency promotes trust. It's no coincidence that many remote-by-design companies practice radical transparency: Buffer, for example, publishes its revenue figures and the salaries of all its employees.

Automattic, the company behind WordPress.com, also emphasizes transparency. In his book, The Year Without Pants, Scott Berkun shares his experience of working remotely for Automattic, where all decisions and discussions are available internally to every employee on its P2 discussion platform.

The chat feature in Planio works in a similar way. Discussions are open for everyone to see and chat logs are linked automatically from the issues discussed so nobody is left out; even new hires can read up on what previous decisions were made and why. When I started building the chat feature, I considered adding a feature for chatting privately with others, but when we discussed it as a team, we ended up leaving it out because we wanted to keep team communication as transparent as possible.

I think transparency is critical for remote teams. Imagine you've just joined a team of remote developers. Perhaps you've never met your new colleagues. You don't know the unspoken rules of behavior. You might be worried about whether you're doing a good job. Are your teammates actually being sarcastic, or do they really mean their compliments? Is everyone privately discussing how good an engineer you are? Open, visible communication leaves far less room for that kind of doubt.

Digitize Your Systems
From telephone providers to banks, we choose our services based on what they offer by way of online platforms (many will even give you a small financial incentive for going paperless, and it's great for the environment, too). I'm lucky to have a lawyer and an accountant for Planio who are comfortable sending emails or Google Hangouts messages instead of summoning me to their offices. (I strongly recommend asking about this at the first meeting.) Bonus points for getting them to sign up with your project management tool and become part of your team!

We’ve even digitized our postal mail: at Planio, we use a service called Dropscan that receives our letters, scans them, and forwards the important ones to the appropriate person. You don't want to have a friend pick up your mail and read it out over Skype. If you can't find a mail-scanning provider in your city or country, some coworking spaces offer virtual memberships that let you maintain a physical mailing address while you're away.

For companies sending out mail, there are services that mean you never have to visit a post office again. We use a German printing company with an API that automatically sends a letter along with stickers to each new paying Planio customer. It's something people love, and we don't have to print or mail a thing. International alternatives include Lob and TryPaper.
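
The article doesn't show what such an integration looks like. As a rough illustration, most print-and-mail APIs accept an HTTP POST describing the recipient and the document; the sketch below is hypothetical, and the endpoint-less payload, field names, and schema are invented for illustration rather than taken from Lob or the German provider:

```python
import json

def build_letter_request(to_name: str, to_address: str, template_id: str) -> dict:
    """Assemble a payload for a hypothetical print-and-mail API.

    All field names here are illustrative assumptions, not any real
    provider's schema.
    """
    return {
        "to": {"name": to_name, "address": to_address},
        "template": template_id,       # e.g. a welcome letter with stickers
        "service": "standard_post",
    }

# On each new paying signup, a payload like this would be POSTed to the
# provider's API (e.g. with urllib.request); here we just serialize it.
payload = build_letter_request("Ada Lovelace", "10 Example St, 10115 Berlin", "welcome-letter")
print(json.dumps(payload, indent=2))
```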

Should You Mandate a Digital Presence?
In a co-working space on the tropical island of Koh Lanta, Thailand, I noticed that someone in a support role for a major e-commerce platform was constantly on a live video feed with the rest of the team. Sqwiggle offers a similar “presence” functionality for remote teams.

I suppose mandating that all employees be on video while working stems from a fear that employees will abuse remote work arrangements. In my experience, that's not the case. At the tropical co-working space, there's a certain urgency in the air despite the laid-back clothes and coconut drinks. People are quietly focused on their laptops; it's as if they want to make sure remote work delivers results so they can stay out of a fixed office for good.

We found that we don't need a digital presence because there is a great level of trust among everyone on the team. I also think it's paramount to respect everyone's privacy. That said, if your company is moving from an all-on-site setting to remote work, a digital presence might help more anxious managers overcome any trust issues.

Choose Bootstrapping over Venture Capital
Most venture capitalists are looking for outsized returns, so they'll prefer an intense 12-month burst of work from a team over a more sustainable pace. Front App, a startup funded by the Silicon Valley accelerator Y Combinator, rented a house in the Bay Area for its three-month stint in the program. The goal is to optimize for evaluating a business idea quickly.

Given the outsized-return mindset, you may have a hard time convincing a venture capitalist to fund you when you're working from a beach in Cambodia. This is why many venture-backed startups that embrace remote work (such as Buffer or Treehouse) built leverage first: Buffer was profitable before taking on investment, while Ryan Carson, the founder of Treehouse, had already proven himself with a previous startup.

Here's a better way than venture capital: bootstrapping, which means financing your company with revenue from its first customers. In my opinion, it's by far the superior approach because it lets you build your company on your own terms and remain in control. However, it often requires holding down a second job or freelancing on the side while you get your company started. It took me about two years of working on both Planio and client projects (via my software development agency LAUNCH/CO) to get going, but it was well worth it.

Bootstrapping also forces you to build a business that generates revenue from the very beginning, which I find much healthier. Hint: building a B2B SaaS makes this much easier than creating a consumer app, because businesses are far more willing to pay monthly subscriptions for something that adds value. You'd have to sell a lot of consumer iPhone apps at $0.99 to cover monthly payroll for even the smallest of teams.
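
To put rough numbers on that last claim (the $30,000 monthly payroll figure and the 30% app store commission are my assumptions, not from the article):

```python
# Back-of-the-envelope comparison: one-off $0.99 app sales vs. B2B SaaS.
payroll = 30_000                  # assumed monthly payroll for a tiny team
app_net = 0.99 * 0.70             # developer's cut after a 30% store commission
app_sales_needed = round(payroll / app_net)

saas_price = 99                   # assumed mid-tier B2B subscription price
saas_customers_needed = round(payroll / saas_price)

print(app_sales_needed)           # 43290 one-off sales, every single month
print(saas_customers_needed)      # 303 recurring subscribers
```

Over 43,000 fresh sales a month versus roughly 300 retained subscribers is the whole argument for recurring B2B revenue in two lines of arithmetic.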

Price Your Products Strategically
One of our first clients was a massive technology company with billions in annual revenue. Obviously, I was delighted that they chose us over much bigger, more established competitors. They're still a happy customer, but we have since moved away from very large enterprise accounts; I've found that they require a lot of hand-holding and in-person meetings before they'll become a customer.

As Jason Lemkin points out in his article on scaling customer success for SaaS, when you have big enterprise accounts, someone will have to get on a jet to visit them twice a year. If you’re a small company of two or three people, that person is going to be you, the CEO, the CMO and the CSO all rolled into one overworked hamster.

Keeping your pricing within the rough bounds of the $49/$99/$249 model suggested by developer-turned-entrepreneur Patrick McKenzie means you avoid having to hire an enterprise sales team, and the massive amount of capital required to fund one. As a customer, you don't expect the CEO to pop in at Christmas with a box of chocolates when you're paying $249 a month.

Build on Open Source
A venture-backed business built on proprietary software is great when your play is a winner-takes-all game and you aim to own the market. When you're a bootstrapped company, open source software can give you reach and leverage you could never have achieved otherwise.

There is plenty of precedent for profitable tech companies building a business around open source software. Basecamp famously open-sourced Rails, guaranteeing itself a supply of highly qualified engineers for the rest of eternity. GitHub became a unicorn by leveraging Git, the open source project Linus Torvalds started to manage the Linux kernel sources. Our friends at Travis CI started as an open source project, ran a crowdfunding campaign, and then turned it into a remote-focused bootstrapped business (which also campaigns for diversity in tech through its foundation).

Planio is based on Redmine, and we contribute many of our features and improvements back to the community. This works great in multiple ways: our contributions and engagement help advance the open source project, and Planio gets exposure to potential new customers. For us, it's the most authentic way to build a brand: by showing our code and taking part in open technical discussions, we can demonstrate that we know our stuff!

Hire Proven Professionals
Hiring a fleet of interns every year makes sense only if you’re intent on scaling up your employee count as soon as you hit the next round of funding.

Outsourcing a task is easy if it's copy-and-paste work, but you don't want to outsource your DevOps to whoever bids the lowest hourly rate when thousands of customers rely on your servers. You'll want proven professionals, such as those at Toptal.

Matt Mullenweg, the founder of the popular open-source blogging platform WordPress, has stated that a focus on quality means his company, Automattic, predominantly hires experienced candidates who can handle the unstructured working environment of a remote company.

To that end, Automattic “auditions” candidates by paying them to work on a project for several weeks, then hires them based on their performance, a method it has found far more effective at identifying the right candidates than traditional CVs and cover letters.

Emphasize Quality of Life
Work takes up a massive amount of our time, year in and year out. It shouldn't be something you simply endure to get through; otherwise, you'll end up wasting a huge chunk of your life. The best source of motivation, and the main ingredient for great results, is a work environment that's inspiring, enjoyable, and fun. Travelling, learning, and engaging with people from different cultures makes work feel less like a sacrifice or necessary evil (at least in my life) than a nine-to-five office job does.

It's not just about travelling the world, though; there's also the personal freedom aspect. Parents get to spend more time with their kids by avoiding a two-hour commute. You don't have to live in Silicon Valley to earn San Francisco wages. And if your significant other gets a great job opportunity abroad, you're not faced with the painful choice between staying at your job to continue your career or becoming a “trailing spouse” with limited career options.

At Planio, even though many of us work remotely, we all try to meet up at least once a year in a fun location. Last year, we spent a few weeks of summer in Barcelona, and several of us met here in Koh Lanta this year. I'm still looking for ideas for the next destination, so let me know if you have any travel tips!

What tools, ideas or techniques have you found that make working remotely easier and more effective? Leave a comment below.

Published by JAN SCHULZ-HOFEN – FOUNDER & CEO @ PLANIO