Does your workflow work offline?

Introduction

For several years, I’ve been noticing that more and more work is done entirely in web browsers. Software-as-a-service solutions are present in virtually every field related to productivity: from online mail services to collaboration tools for developers. I must say that such tools are certainly useful. They allow even distributed teams to work together very efficiently, despite the distance. However, I think that too much dependence on this kind of service might in some cases result in a decline in productivity when access to these services is disrupted.

Communication

Communication is a field that might appear to require constant connectivity to function. In the case of “real-time” channels like Slack or video conferences, this is true. However, media like e-mail are far more asynchronous and, as a result, allow us to actually work offline. The e-mail protocol was designed to resemble the “real” mail model. Messages are passed through “post offices” (servers that know where to deliver each message) and finally transferred to our personal mailbox. We can then download messages, store them on disk, and access them even without an Internet connection. Furthermore, e-mail allows us to compose replies offline and send them out as soon as the connection returns.
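The offline half of this loop is easy to sketch. Here is a minimal, hypothetical example (paths and addresses are invented): compose a reply as a plain RFC 5322 message and drop it into a local Maildir-style outbox; once the link is back, a sending tool such as msmtp or sendmail(8) can flush the queue (the sending step is assumed, not shown).

```shell
# A Maildir-style outbox for messages composed offline.
# The location is made up for this example.
OUTBOX="${TMPDIR:-/tmp}/offline-outbox-demo"
mkdir -p "$OUTBOX/new" "$OUTBOX/cur" "$OUTBOX/tmp"

# Compose a reply offline as a plain RFC 5322 message file.
cat > "$OUTBOX/new/$(date +%s).offline-reply" <<'EOF'
From: me@example.org
To: peer@example.org
Subject: Re: status update
In-Reply-To: <original-message-id@example.org>

Thanks, I went through the numbers while offline; details below.
EOF

# When the connection returns, a sender such as msmtp or sendmail(8)
# can iterate over "$OUTBOX/new" and deliver each queued file.
ls "$OUTBOX/new"
```

The point of the Maildir layout is that each message is a separate file, so queueing and flushing need nothing more than the filesystem.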

However, webmail interfaces (like Gmail or Office365) are hugely popular among users. Many of my friends identify e-mail with Gmail. Furthermore, companies like Google and Microsoft seem to encourage this association. The integration of their webmail clients with other applications in their respective ecosystems grows every year. Just look at the concept of Google Workspace, which binds together the core productivity products from Google: documents, calendars, e-mail, etc. Of course, this is pretty convenient. However, this approach prevents users from working offline. If the connection is poor or down, the user will not be able to access any of the data stored in the service. Do you have any low-priority e-mails or documents to handle during an unexpected outage? Forget it; you will only be able to process them once the network is back, when your main tasks will demand attention again. Do you need something noted in an e-mail or a document stored in the cloud? Without Internet access, it is completely out of your reach. Or maybe your work is based entirely on e-mail? If so, lack of access to the webmail forces a compulsory break.

One can still use the “legacy” protocols that power the e-mail ecosystem (namely SMTP, POP3, and IMAP) and access the contents of a mailbox via an offline desktop client. However, the documentation on using these protocols is not widely advertised and is targeted at “advanced” users. Also, the procedure gets considerably more complicated when the user enables two-factor authentication.

A notable exception is Apple, which ships an actual offline e-mail client with their operating system. The fact that it is bundled with the OS and fits into the Apple ecosystem might be a reason why the client’s market share is high. Also, mail clients for mobile devices allow offline access to data. However, working from a phone might not be viable or convenient in many cases.

Development

Another field that web-based workflows have taken by storm is software development. Nowadays, no one is bewildered by services like GitHub or Bitbucket that host code repositories. The hosting of code itself is resistant to temporary network failures. However, these services usually concentrate much more collaboration functionality: their most important features, pull requests and peer reviews, are available only via web interfaces. If the web page cannot be accessed, nothing can be done and the developers need to pause their work. When I was using GitHub commercially, I experienced several such breaks due to outages.

The truth is, the above setup is only one of several workflows supported by git. Git itself includes collaboration tools for reviewing proposed changes: it allows for a workflow based on patches that can be sent via e-mail (possibly to a mailing list) and applied by the repository maintainer if everything is all right. This workflow is used in very large open-source projects like the Linux kernel or the PostgreSQL database. If we combine this approach with an offline e-mail client, we can review changes proposed by our peers offline. Unfortunately, none of the leading repository hosting providers supports this kind of workflow. If a team wants to adopt patch-centric development, they might set up a mailing list to exchange patches themselves or use a service like SourceHut to host their code. The biggest drawback of ditching the code hosting service in favor of a mailing list is the higher barrier to entry, which is partially caused by the popularity of the former. The developers need to set up their e-mail clients properly. Furthermore, they need to get used to writing their reviews in e-mail instead of a familiar web interface that allows for selecting lines and adding comments.
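The mechanics of that patch workflow can be sketched entirely with local repositories (the directory names here are invented for the example): a contributor exports a commit as a mailable patch file with git format-patch, would in practice post it with git send-email, and the maintainer applies it with git am, which preserves authorship and the commit message.

```shell
set -e
WORK="${TMPDIR:-/tmp}/patch-workflow-demo"
rm -rf "$WORK"; mkdir -p "$WORK"

# The maintainer's repository with an initial commit.
git init -q "$WORK/upstream"
git -C "$WORK/upstream" -c user.name=Maintainer -c user.email=m@example.org \
    commit -q --allow-empty -m "Initial commit"

# A contributor clones it, commits a change, and exports the
# change as a patch file formatted like an e-mail message.
git clone -q "$WORK/upstream" "$WORK/contrib"
echo "offline-friendly" > "$WORK/contrib/NOTES"
git -C "$WORK/contrib" add NOTES
git -C "$WORK/contrib" -c user.name=Contributor -c user.email=c@example.org \
    commit -q -m "Add NOTES"
git -C "$WORK/contrib" format-patch -1 -o "$WORK/patches"

# In practice the patch would travel through a mailing list via
# git send-email; here the maintainer applies the file directly.
git -C "$WORK/upstream" -c user.name=Maintainer -c user.email=m@example.org \
    am "$WORK/patches"/0001-*.patch
git -C "$WORK/upstream" log --oneline
```

Because the patch is just a text file, every step between format-patch and am can happen offline; only the e-mail transfer itself needs a connection.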

There is one more trend related to software development that is gaining popularity: cloud-based developer environments, like GitHub Codespaces or JetBrains Space. This might seem very convenient for the developer at first. After all, one needs only a browser to access all the development resources and start coding. Also, the development machine does not need to be beefy, as the code is run in the cloud. However, this means that if the network is down or the service undergoes an outage, the team can do literally nothing. Furthermore, such solutions might force the team members to use the environment exclusively, even if some of them are much more proficient with other editors or IDEs.

Documentation

Documentation is another thing that is usually accessed online. Even though most language distributions ship with built-in documentation accessible even from the IDE, the reflex to ask Google for details remains. This was true for me for a very long time. However, I changed my approach to seeking solutions on the Internet after I switched to OpenBSD. Thanks to its excellent and complete documentation in the form of manual pages, I leaned towards using the apropos(1) and man(1) utilities. As a matter of fact, I started to use Google less and less, as the information already on my local machine was enough to solve almost every problem I had. This was especially useful when I was debugging issues with network interfaces. What is more, I started a little side project. As it requires a significant amount of text processing, I chose Perl. Since Perl is part of the OpenBSD base system, its complete documentation is included. I had not written Perl in so many years that these docs are something I reach for very often. So far, I have not been forced to use Google at all while writing my little app.

However, sometimes the tools we use lack good manuals or documentation. This is true for many community-developed libraries. In such cases, reading through GitHub issues, StackOverflow questions, or the source code might be the only way to solve problems or learn the tool.

Conclusion

Don’t get me wrong: online tools have a lot of pros. They do not require an installation or setup process, they usually work on any hardware and operating system, and they can provide a consistent feature set to all team members. However, my claim here is that making our workflows heavily dependent on browser-based applications may render developers unproductive when they are far from a stable Internet connection. It is true that developers nowadays have access to high-bandwidth links. On the other hand, outages happen, both at ISPs and at service providers. Also, the owners of the online tools we use can go out of business or decide to discontinue their services. As a result, those who relied too heavily on such tools will need to switch to and learn other tools, and such a change will probably undermine their productivity considerably.

There is also one more, less evident benefit of not relying on network connectivity in day-to-day work, related to remote work. For instance, we can go to some remote location for a “workation”. If our scheduled tasks do not require too much synchronous communication like video conferences, we will not need to limit ourselves to places with considerable Internet bandwidth. Instead, we could choose some nice spot, do our tasks offline, start the mobile access point on our phone, and sync the results of our work with the rest of the team. Afterward, we could call it a day and take advantage of the surroundings. Isn’t it tempting?