Links

Think of me as a web crawler with taste.

The Democratization of Data Science

Jonathan Cornelissen:

Relegating all data knowledge to a handful of people within a company is problematic on many levels. Data scientists find it frustrating because it’s hard for them to communicate their findings to colleagues who lack basic data literacy. Business stakeholders are unhappy because data requests take too long to fulfill and often fail to answer the original questions. In some cases, that’s because the questioner failed to explain the question properly to the data scientist.

…

A data-literate team makes better requests. Even a basic understanding of tools and resources greatly improves the quality of interaction among colleagues. When the “effort level” — the amount of back-and-forth needed to clarify what is wanted — of each request goes down, speed and quality go up.

…

Shared skills improve workplace culture and results in another way, too: They improve mutual understanding. If you know how hard it will be to get a particular data output, you’ll adjust the way you interact with the people in charge of giving you that output. Such adjustments improve the workplace for everyone.

5 Rules of Coaching

Liz Keogh:

As the Peter Principle suggests, we tend to rise to the level of our incompetence… but that’s not actually such a bad thing, as long as we can learn fast, safely. The best way to do that is to make sure things are safe-to-fail, which usually means putting appropriate feedback loops in place. In a human system, that usually means feedback.

Sometimes it’s the simplest thing in the world, and we forget to do it. Clarifying why you want something allows people to make autonomous decisions about how best to work towards the outcome you want; or (even more important) give you information about the context you were unaware of that will cause difficulty getting that outcome.

The Hazards of a “Nice” Company Culture

Timothy R. Clark:

Low-velocity decision making. In a nice culture, there’s pressure to go along to get along. A low tolerance for candor makes the necessary discussion and analysis for decision making shallow and slow. You either get an echo chamber in which the homogenization of thought gives you a flawed decision, or you conduct what seem to be endless rounds of discussion in pursuit of consensus. Eventually, this can lead to chronic indecisiveness.

The Long Slow Ramp of SaaS Success

You need to take a step back and view data on a macro level, not micro. As the founder, you should care more about the trends than about the constant, inexplicable anomalies.

One of the really frustrating parts of running a business is that many times we just don’t know the answer to “why?”.

Why did churn go up 10%? Why are trial conversions decreasing? Where did all these new users come from? Why is our growth half of what it was last month?

Many of those questions have no answer, and trying to find one will cause you to rip your hair out.

Reversible and Irreversible Decisions

Bezos considers 70% certainty to be the cut-off point where it is appropriate to make a decision. That means acting once we have 70% of the required information, instead of waiting longer. Making a decision at 70% certainty and then course-correcting is a lot more effective than waiting for 90% certainty.

Reversible decisions can be made fast and without obsessing over finding complete information. We can be prepared to extract wisdom from the experience with little cost if the decision doesn’t work out. Frequently, it’s not worth the time and energy required to gather more information and look for flawless answers. Although your research might make your decision 5% better, you might miss an opportunity.

How Complex Systems Fail

Eradication of all latent failures is limited primarily by economic cost but also because it is difficult before the fact to see how such failures might contribute to an accident. The failures change constantly because of changing technology, work organization, and efforts to eradicate failures.

Indeed, it is the linking of these causes together that creates the circumstances required for the accident. Thus, no isolation of the ‘root cause’ of an accident is possible. The evaluations based on such reasoning as ‘root cause’ do not reflect a technical understanding of the nature of failure but rather the social, cultural need to blame specific, localized forces or events for outcomes.

Hindsight bias remains the primary obstacle to accident investigation, especially when expert human performance is involved.

So many more nuggets of wisdom in there.

Islands Architecture

The general idea of an “Islands” architecture is deceptively simple: render HTML pages on the server, and inject placeholders or slots around highly dynamic regions. These placeholders/slots contain the server-rendered HTML output from their corresponding widget. They denote regions that can then be “hydrated” on the client into small self-contained widgets, reusing their server-rendered initial HTML.
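To make that concrete, here is a minimal client-side loader sketch in TypeScript. Everything specific in it is assumed for illustration rather than taken from the article: the `data-island` and `data-props` attributes, the `islands` registry, and the per-widget hydrate modules are hypothetical stand-ins for whatever conventions and component runtime (Preact, React, plain DOM code) you actually use.

```ts
// Sketch of an islands loader: the server has already rendered the page,
// marking each interactive region with data-island (widget name) and
// data-props (serialized initial props). Names and paths are assumptions.

type HydrateFn = (el: HTMLElement, props: Record<string, unknown>) => void;

// Hypothetical registry mapping island names to lazily loaded hydrators.
const islands: Record<string, () => Promise<{ default: HydrateFn }>> = {
  "image-carousel": () => import("./islands/image-carousel.js"),
  "add-to-cart": () => import("./islands/add-to-cart.js"),
};

// Hydrate each placeholder only when it scrolls into view, so static
// regions of the page never pay for JavaScript they don't need.
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const el = entry.target as HTMLElement;
    observer.unobserve(el);

    const name = el.dataset.island!;
    const props = JSON.parse(el.dataset.props ?? "{}");

    // The server-rendered HTML inside `el` is reused as the initial markup;
    // the widget only attaches behavior to it.
    islands[name]?.().then(({ default: hydrate }) => hydrate(el, props));
  }
});

document
  .querySelectorAll<HTMLElement>("[data-island]")
  .forEach((el) => observer.observe(el));
```

Using an `IntersectionObserver` here is one possible scheduling choice: islands below the fold only fetch and hydrate their code when they become visible, while the purely static, server-rendered parts of the page ship no script at all.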