Cyber Security, Where Rules Rule?

Corporate culture and company values might be buzzwords guaranteed to give you the itch, but how do they apply to cyber security?


Corporate culture and company values – buzzwords guaranteed to give you the itch. Phrases like ‘do the right thing’ and ‘always put the customer first’ are beamed at new joiners via ghastly onboarding videos. Yet, unfortunately, these statements are almost always devoid of any real meaning or genuine engagement with an organisation’s ground truth.

Yet, after reading No Rules Rules: Netflix and the Culture of Reinvention by Reed Hastings and Erin Meyer, I am wondering whether there is some promise in company values after all. In the book, Hastings and Meyer shine a light on Netflix’s vibrant culture. The book brings up interesting themes for any organisation. It also poses questions on how organisations like Netflix, which thrive on free thinking and creative decision making, might clash with cyber security cultures that have a tendency to introduce controls, processes, and restrictions.

This blog will explore how No Rules Rules applies to cyber security (or fails to), but I first want to outline what I enjoyed about the book.

Thoughts on the Book

Above all, No Rules Rules offers a refreshingly substantive discussion on corporate culture. Netflix has managed to establish a genuinely distinct culture that moves beyond many of the generic and hollow platitudes thrown around the corporate world.

I also enjoyed how the book was written. As Netflix CEO, Hastings provides direct insights about the company and how its culture has developed over the years. This is then complemented by Meyer, who provides a wider, outside perspective from her academic background.

No Rules Rules examines various facets of the Netflix culture, but three themes really stood out.

First, there is a clear focus on recruiting exceptionally talented people. This means paying above the market rate and making a concerted effort to retain exceptional employees over the long run. Part of this approach also means being willing to let people go who fail to hit the high standards set by the company.

I am personally sympathetic to this approach. I have long believed that a small team stacked with capable, ambitious, and determined employees will be able to outcompete larger yet less engaged teams. It’s one reason why I feel fortunate to work in cyber security, where so many view their career as a true vocation and calling. Of course, there is always a risk that this spills into burnout, and a team of high performers still needs to be carefully managed. Yet, this connection to purpose and impact within the security community is something to really appreciate.

Second, Netflix has a remorseless focus on farming for dissent and relentlessly seeking peer feedback. The company encourages big risk taking, provided that lessons are learned from failure (no matter how spectacular). Netflix fosters this approach by baking feedback sessions into workflows and leading from the top. Hastings stresses the importance of leadership being as candid as possible about their own failures and deficiencies.

From my own experience, this approach is very normal within academia. Constructive criticism is gratefully received and part of a process to improve research. In fact, most academics would be disappointed to only receive positive feedback and likely assume that reviewers have not read their work carefully enough. This is because critique provides an almost guaranteed way to improve ideas and refine arguments before they are published.

Translating this embrace of constructive criticism into the corporate world can be tricky, however. Critique is often unfairly associated with negative performance reviews. Corporate cultures also tend to be far more collaborative and this can make it awkward to criticise teammates. Furthermore, constructive criticism within the corporate world is more likely to focus on character traits and interpersonal skills, rather than research ideas. This immediately makes the process far more personal.

The third theme of the book that stood out was the sheer number of controls that Hastings has relinquished at Netflix. Whether it be inflexible expense policies, fixed time-off periods, or rigid people management processes, Netflix has been decisive in questioning how these controls can be minimised (if not removed entirely).

I personally have mixed opinions on this. Perhaps it is the European in me, but I worry that unlimited paid time off will lead to reduced uptake if managed incorrectly. Yet, Hastings is not acting recklessly in abandoning many typical corporate policies. The loosening of controls is accompanied by a buffet of recommendations to navigate this approach effectively.

One thing that really stuck out was the sense of ownership and personal responsibility that falls on Netflix’s employees. For instance, Netflix circulates its quarterly performance results to employees ahead of their public release – an act unthinkable for many public companies given concerns around compliance, data leakage, and insider trading.

Yet, Netflix’s sneak peek of quarterly numbers is accompanied by a concerted education drive. Employees may have early access to sensitive information, but are also made acutely aware of exactly what will happen if this trust is abused (consequences include a medley of losing your job and jail time).

In short, Netflix employees enjoy a distinctly free rein, yet this comes with greater personal responsibility and the risk of stark consequences when trust is abused.

Antithesis to cyber security?

I am a big advocate for taking ownership and personal responsibility. This means Netflix’s culture is very much in line with my own views on management and leadership. Yet, I don't believe that this approach is practical or feasible for defending networks. In fact, a no rules rules approach to cyber security strategy would likely be a complete disaster.

Indeed, what is interesting about many elements of the Netflix culture is how they contradict an archetypal cyber security culture. Most mature cyber security functions understand that employees are busy and ultimately largely uninterested in security. Staying secure should therefore be made as convenient and seamless as possible. Educating employees on why they should avoid suspicious URLs will always be a worthwhile endeavour, yet security teams should also plan for the inevitable and introduce controls and restrictions that limit the damage when dodgy links are eventually clicked.

In fairness to Hastings and Meyer, this reality is acknowledged in the book. They make a distinction between cultures like Netflix (where fostering creativity is the priority) and those where safety will come first (where rules and process will be more important).

This is interesting for at least two reasons.

First, this suggests that the culture and approach imposed by most cyber security functions will often run against an organisation’s broader ethos. How do creative workplaces with a focus on big gambles and relinquishing controls coexist with anxious cyber security teams that have contradictory instincts? Perhaps this is best viewed as a healthy and inevitable tension that balances competing priorities within an organisation. However, it does pose a tricky balancing act for security functions given the importance of maintaining a sense of goodwill amongst employees.

I’m not convinced that the cyber security community talks enough about cultural alignment. Security conferences offer no shortage of autopsies into the latest espionage campaigns or discussions of emerging threats. Organisational culture, by contrast, is often nowhere to be seen or relegated to a ‘soft’ issue at best. Yet, I am often struck by the number of CISOs and security leaders who mention that their biggest challenges come back to navigating thorny office politics and integrating their ideas around security into a broader corporate culture.

There are also question marks here for the cyber threat intelligence (CTI) industry. While the threat intelligence community has oodles of people interested in analytical techniques and how to avoid bias, we rarely ask whether we have a deeper bias towards certain problem sets: a bias towards interesting technical developments as opposed to the management and cultural issues that are so often dismissed.

The second challenge is that the cyber security industry still needs creativity. In fact, we need it in spades. The majority of security functions are short-staffed and the industry certainly possesses its fair share of employee burnout and fatigue. Innovation and new ways of thinking are more important than ever.

This means cyber security vendors are faced with interesting cultural conundrums. How do they balance their natural instinct to keep networks secure by enforcing security controls with an approach that simultaneously encourages the risk taking and free thinking required to devise the next generation of desperately needed security solutions?

Fostering such a hybrid approach is certainly not impossible, but I'm not convinced that we talk about it enough.

Educating Insider Threats

A core principle of the no rules rules approach is ensuring that employees clearly understand the consequences of their actions. Although I don’t believe this is effective for cyber security on the whole, it might have interesting applications for one issue in particular: tackling insider threats.

Identifying and preventing insiders is likely to be more important than ever in the year ahead. This is because we are increasingly seeing ransomware operators try to recruit company employees in an attempt to compromise networks.

I have long been dubious about some of the common advice for detecting and preventing insider threats. Much of this seems to come down to looking out for employees who suddenly start working irregular hours or act a bit shifty. Yet not all insiders are perspiring and anxious wrecks with bags under their eyes.

Ultimately, many insider initiatives focus on how to spot insiders through their behaviour. What No Rules Rules highlights is that education can also be used to deter action in the first place.

The majority of insiders will of course understand that their actions are illegal, yet employers rarely clarify this in substantive detail. Education programmes that spell out exactly how serious the offence is and the length of a potential prison sentence could therefore have powerful effects.

Yet, despite some potential overlap around tackling insider threats, the overwhelming takeaway from No Rules Rules was that a chasm exists between an innovation culture and many of the essential ingredients of a conventional cyber security approach. I am by no means the first person to discuss this and it is perhaps a slightly hackneyed observation. Yet, it remains vital to explore solutions and avoid a defeatist mentality.

I am confident that there are ways for an innovation culture and a security culture to coexist, and there is also plenty of scope for the security community to innovate itself. Whilst these challenges are not insurmountable, they will not be solved through indifference and neglect. The cyber security community is not above business and cultural alignment conversations. The time for that discussion to get serious is now overdue.