Monday, November 7, 2016

GHC16 session notes

Brief notes from other sessions:

Blockchain and the Internet of Things:

Issues with traditional databases:
a) Transactions can be suppressed or deleted; DB admins have all the power.
b) All the risk is concentrated on the DB owner.
c) Integration costs are very high.
Solution: Blockchain technology
a) All transactions are visible to all participants, each of whom holds a copy of the ledger. Transactions can’t be deleted or modified.
b) Risk is distributed.
c) Lower integration costs.
Hyperledger project: Trying to standardize distributed ledgers across multiple industries.
Blockchain and IoT use cases:
a) Tracking container locations
b) Managing water heaters
c) Triggering payments from connected devices when a certain usage level is reached
d) Keeping vehicle maintenance records to track maintenance schedules, parts replacement, etc.
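
The tamper-evidence property above (transactions can't be deleted or modified unnoticed) comes from hash-chaining: each block commits to the hash of its predecessor. Here's a toy Python sketch of just that idea (illustrative only - real systems like Hyperledger add consensus, signatures, and replication):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transaction):
    """Append a transaction, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"tx": transaction, "prev_hash": prev}
    block["hash"] = block_hash({"tx": transaction, "prev_hash": prev})
    chain.append(block)
    return chain

def verify(chain):
    """Any edit to an earlier transaction breaks every later hash link."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        if block["hash"] != block_hash({"tx": block["tx"], "prev_hash": prev}):
            return False
        prev = block["hash"]
    return True

ledger = []
append_block(ledger, {"from": "A", "to": "B", "amount": 10})
append_block(ledger, {"from": "B", "to": "C", "amount": 4})
assert verify(ledger)

ledger[0]["tx"]["amount"] = 1000  # a DB admin quietly "fixing" history
assert not verify(ledger)         # ...is immediately detectable
```

Every participant can run `verify` on their own copy, which is why the risk is distributed rather than concentrated on one DB owner.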

Gender Equality Index: The Power of Data to Drive Change

A McKinsey report shows that advancing gender equality could add about $12 trillion to global GDP by 2025. Another report shows that profits increased by 15% when the proportion of women in management rose from 0% to 30%. So Bloomberg launched the BFGEI (Bloomberg Financial Services Gender-Equality Index) in May 2016. The framework steps:
a) Member selection/weighting
b) Daily calculation of index returns
c) Index impact/consumption
Bloomberg Gender Equality index survey:
Index data modeled in a 3-D cube (time, company, characteristics)
Impact: GEI firms have recruiting strategies for women, may sponsor financial education programs

Containerization of Applications: What, Why and How

Containers are popular because of:
a) Flexibility in deployment
b) Efficiency
c) Low maintenance
d) Low cost
Best practices for building container-ready apps:
1. Don’t freak out if instances crash – they are disposable. Use a container orchestration engine.
2. Don’t crash – retry instead. Return a suitable status code (e.g., 503).
3. Containers use copy-on-write (COW) filesystems – once the container is gone, so is the filesystem. Use persistent storage only when it’s really needed.
4. Persistent data is special, so don’t log to files. Log to stdout/stderr instead.
5. Don’t bake security keys into the image!
6. Services may not be co-located: DB, frontend, API have different needs.
7. Design for scale from the start.
8. Add liveness/readiness checks – is the app running? Has it finished starting up? Is it in maintenance mode?
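
Practices 2, 4, and 8 can be sketched as plain functions that any web framework would wire up to health-check endpoints. This is a minimal, framework-agnostic sketch; `AppState` and the probe function names are my own, not from the session:

```python
import logging
import sys

# Practice 4: log to stdout/stderr, never to files inside the container.
logging.basicConfig(stream=sys.stdout, level=logging.INFO, format="%(message)s")
log = logging.getLogger("app")

class AppState:
    """Hypothetical in-memory app state driving the health probes."""
    def __init__(self):
        self.started = False      # has startup work (migrations, caches) finished?
        self.maintenance = False  # is the app in maintenance mode?

def liveness(state):
    """Liveness probe: the process is up and able to answer at all."""
    return 200

def readiness(state):
    """Readiness probe: 503 tells the orchestrator and clients to retry
    later (practice 2) instead of treating this instance as broken."""
    if not state.started or state.maintenance:
        return 503
    return 200

state = AppState()
assert readiness(state) == 503   # still starting up: not ready yet
state.started = True
assert readiness(state) == 200   # finished starting: ready for traffic
state.maintenance = True
assert readiness(state) == 503   # draining for maintenance
assert liveness(state) == 200    # but the process itself is still alive
log.info("probes behave as expected")
```

An orchestrator polls both probes: a failing liveness check gets the instance restarted (practice 1: instances are disposable), while a failing readiness check just routes traffic elsewhere.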

Saturday, October 29, 2016

So You Want to Hack the Planet - Demystifying Careers and Opportunities in Cryptography, Security & Privacy

Eleni Gessiou (Security Engineer, Facebook), Natalie Silvanovich (Security Researcher, Google), Nadia Heninger (Magerman Term Assistant Professor, UPenn), Sandy Clark/Mouse (PhD student/Senior Research Staff, UPenn)
Moderator: Sarah Harvey (Security Software Engineer, Square)

This session was a path-breaker in its own way, because security wasn't a subject that was talked about much at previous Grace Hoppers. Each of the panelists gave a brief introduction, talking about what she does in the security field. Nadia's main area of research is applied crypto, particularly in breaking crypto :-).  Most of her work involves network security and sometimes, applied mathematics. She loves the fact that security/crypto span the whole CS stack, giving her the opportunity to work on a broad swath of problems. Eleni, who is a security engineer at Facebook, works on detecting suspicious behavior on FB. Natalie works at Google, on Project Zero, where her main task is to find zero-day attacks. She got into security quite by accident - first through a high-school project and then by applying for a junior hacker position while at university. Sandy Clark (whose hacker handle is Mouse) has had a long-standing interest in ethical hacking and cybersecurity in general. Her interests are wide-ranging, but to sum up, she spends her time figuring out how systems actually work as opposed to how they were designed to work.

Sarah kicked off the discussion by asking the panelists about the social/technical challenges they face in the field of security/privacy. Mouse is concerned about how to measure security - there are no laws to figure out how to use technology in a sufficiently secure way while benefiting individuals. There are a ton of security problems other than just "Is there a bug in my code that someone can exploit?". One of the things that Natalie finds in her day-to-day work is that finding zero-day exploits is incredibly taxing and difficult. On a larger scale, code is error-prone - that's just a fact - but how to make sure that developers avoid making security bugs, and how orgs can teach their developers this, is something that hasn't been fleshed out. In her work at Facebook, Eleni has to deal with people from different backgrounds and cultures, and finds that it becomes difficult to make sure that bugs are effectively communicated. Often, Facebook ends up having to provide tailored solutions for each problem, rather than providing a holistic solution. Nadia's main concern is the field of crypto. For a while, folks were complacent about crypto, believing that we had good algorithms that were hard to break. But the Snowden docs revealed that there's a lot more to the security/crypto field than just that. There has been at least one crypto standard that was revoked in the recent past because of allegations that there were backdoors introduced by the NSA. So now the question is: what do we do if those algorithms that we thought were fool-proof actually have backdoors? What do we tell users, and how do we get governments to do the right thing?

Sarah's next question was what the biggest vulnerability in authentication protocols used on the Internet is. Mouse's answer - users. Natalie believes that the biggest problem is when people try to roll their own crypto. Eleni and Nadia point to phishing and passwords.

The next question was what the central themes in security are in terms of job opportunities. Natalie's answer: product security folks (review people's code, try to secure products), product managers (who work with customers to figure out security requirements), customer response engineers, security development, etc. Eleni added that there are also teams that work on protecting corporate assets vs those who try to protect users. Mouse believes that there are several areas of interest - social engineering, bio-hacking, hardware hacking, malware detection, etc. Sarah added that pen testers also play a big role in securing products.

An audience member asked if there's a way to bridge the gap between security developers and policy makers. Nadia and Mouse find that crypto conferences are usually very effective in doing this, since they are attended by security researchers, companies and government agents too. Mouse also suggests that anyone who is interested should get in touch with local politicians to become a technical advisor in the field of security.

Another audience question was how to make sure that there's a balance between the amount of data that's given out and keeping that data secure. Nadia believes that crypto's not the answer, but regulation is. Mouse is currently involved in research in this very area, and recently wrote a law review paper on it. She believes that regulation is necessary to bring together the 3 different stakeholders - individuals who give out the data, companies that collect the data for their business model, and the government.

Some security hygiene principles that the panelists recommend: using a unique, strong password for every site, disk encryption, using 2-factor authentication, ensuring that communication is encrypted end-to-end, and applying security patches on a regular basis.
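
One of those recommendations, 2-factor authentication, usually means TOTP codes - the 6-digit numbers an authenticator app shows you. A stdlib-only sketch of how those codes are derived, per RFC 6238 (the secret below is the RFC's published test key, not a real one):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestep=30, digits=6, now=None):
    """Time-based one-time password (RFC 6238, HMAC-SHA1 flavor)."""
    key = base64.b32decode(secret_b32)
    t = time.time() if now is None else now
    counter = struct.pack(">Q", int(t // timestep))   # 8-byte big-endian counter
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # "dynamic truncation"
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238's test key ("12345678901234567890" in base32); at T=59 seconds
# the spec's expected 6-digit code is 287082.
assert totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59) == "287082"
```

Both your phone and the server derive the code independently from the shared secret and the current 30-second window, so a stolen password alone isn't enough to log in.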

Another question was how security can best be incentivized - should products be given security ratings? Nadia answered that at some point, security policy is going to have to become like health policy. This is a long-term problem that only governments can solve - perhaps the FTC can start going after organizations that are notoriously lax in their security practices. Natalie agrees, and believes that coming up with security metrics is going to be a difficult and long-drawn-out process.

The panelists were then asked to name what they love, and what they hate most about their jobs. Mouse said that she has had to get used to failing a lot and being frustrated - but all of that is dwarfed by the awesomeness of getting things to work. Natalie loves that she gets to play with lots of cool new technology - but then the stressful part is filing bugs and having to deal with people who are frustrated by those bugs :-). Eleni finds it fulfilling that she gets to use her tech skills to actually do good for other people. Nadia likes being able to break crypto - something that's a total breakaway from her image as a quiet, good kid back in school :-).  She finds it especially rewarding that as an academic, she actually gets to talk openly about all the research that she gets to do.

To get into the security field, the panelists recommended taking courses (even online ones), attending hackercons, or participating in bug bounty programs (the best part - you get paid for finding bugs! :-)). One could also start contributing to open-source software, or even just apply for a job in the field.

Thursday, October 20, 2016

GHC16: Building a Self-Driving Car: From Diverse Perspectives

Min Li Chan, Jaime Waydo, Kimberly Toth, Wan-Yen Lo

Despite the fact that this talk was at the tail end of a long day, it was one of the most heavily attended. Having learned through experience this morning that people start queuing up for sessions really quickly, I armed myself with a laptop and one of the ice-creams that were being handed out, and prepared to stand in line 45 minutes ahead of time. There were tons of other people who had had the same forethought, and the line already snaked half-way round the building.

Jaime Waydo kicked off the talk with a simple illustration to show that the concept of self-driving cars isn't new - there were ads, even way back in the 1950s, that showed a car driving itself while the passengers sat gathered round a table, playing board games. It's only recently, though, that things have progressed enough to start putting the idea into execution. One reason X is so passionate about the project is that the stats showing humans are accident-prone are overwhelming. Over 1.2 million people die every year in road accidents, and more than 90% of those deaths are due to human error. Solution: take humans out of the equation. There's an additional, equally compelling reason: self-driving cars would mean that thousands of people all over the world who have lost their mobility, for reasons ranging from failing eyesight to old age, would be free to move around again.

X already has a couple of prototype cars ready that are being tested out in different places like Seattle, Austin and Mountain View. (Fun fact: the team has different nicknames for the cars, Marshmallow Car and Bubble Car being the current favorites.)

One of the first and most complex problems the team had to solve was how to pinpoint exactly where the car was at any point in time. Using GPS wasn't accurate enough, so the team decided to use a system of prior-made maps and sensors. Next, they needed to figure out how to detect both dynamic and stationary objects encountered on the road. For this, the car is equipped with several sensors that give it a 360-degree view up to a distance of two football fields. The car also needs to be able to predict what objects around it will do next - for example, if it detects a construction zone ahead, with some lanes closed off, it should be able to figure out that cars ahead of it will soon make lane changes. This is where machine-learning algorithms come into play. The project also needs several mechanical engineers to work on a myriad of things, from back-up systems for steering and braking, to computer systems specifically for self-driving, to the various sensors used.

The team then showed videos of scenarios they encountered while out on the road on test drives - in one instance, the car stopped for a group of people jumping across the road, while in another, a cyclist suddenly became unnerved and made an abrupt U-turn (a human driver narrowly missed him, but the self-driving car was able to come to a halt at a safe distance).

The session then became a little more interactive, with audience members invited to put themselves into the shoes of the self-driving car team to solve different problems the car might encounter - snowstorms, crowded pedestrian crossings, seasonal changes, and so on.

GHC16: How the World Bank uses Social Network Analysis to Support Entrepreneurship

Kathy Qian, Data Scientist and Innovations Lab Consultant, World Bank

Kathy's session centered on the work she does for the World Bank in gathering and analyzing data to support entrepreneurship, particularly in developing countries. The World Bank is a consortium of about 180 countries that have pooled money and resources together with the express intention of eradicating poverty - so while the World Bank is a bank, it doesn't really act as one :-). Given that goal, it's a little hard to see why the World Bank cares about tech start-ups. There's a simple explanation - policymakers around the world are very interested in creating jobs. And one sector that's been churning out more jobs than any other is the tech sector. In point of fact, only about 10% of those jobs actually involve directly working on technology - often, the jobs are created as a by-product of something that happens in the tech sector. For example, in the recent past, tons of people have been transforming themselves into Uber drivers and Airbnb hosts.

Even accounting for population, Kathy's data-set shows that growth rates in entrepreneurship differ across cities - for example, Bogota's growth rate is much lower than New York City's.

While policy makers tend to look at discrete pieces of infrastructure that come together in measuring city growth from the entrepreneurship perspective, the World Bank looks at the community as a whole. For instance, they take into account factors like the impact of community-building events (conferences), skill-building events and so on. Kathy also explained the difference between social media analysis, which involves looking at data from social media like Facebook or Twitter, and social network analysis, which looks at nodes (i.e., organizations or start-ups) and edges (the relationships between those organizations).

The World Bank started off their research by looking at different types of centralities in social network analysis, such as degree centrality (how many people one is directly connected to), closeness centrality (how close one is, on average, to everyone else), and eigenvector centrality (are the people you're connected to themselves well connected?).

One of the major problems that they face is that there is no good data source containing information on entrepreneurship. Developing countries in particular have a dearth of data. To overcome this, the World Bank first looks at cities that they are familiar with and that have a rich corpus of data (for example, New York City). (While Silicon Valley would be an obvious choice when it comes to entrepreneurship statistics, it is also unique in that it grew without much help or intervention from policy makers. So data from Silicon Valley isn't used in this study. New York, on the other hand, does have policy makers playing a huge role in its success, and was therefore a good choice.) Some stats that came to light - each founder starts about 1.12 companies, incubators have the highest eigenvector centrality, and start-ups the highest closeness centrality. The World Bank even ran regressions to extract data such as where start-ups are most likely to be founded. They were also able to show that you have a better chance of being funded if you already know someone who has received funding (that way, you know the right people to get in touch with). This data is used in conjunction with data gathered from 12 cities around the world (Cairo, Bogota, Dar es Salaam, etc).
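
The three centrality measures can be illustrated on a toy network. This stdlib-only sketch (node names are invented, and a real analysis would use a library like networkx) shows why a hub like an incubator, which everyone connects to, scores highest:

```python
from collections import deque

# Toy undirected network: nodes are organizations, edges are relationships.
graph = {
    "incubator": {"startup_a", "startup_b", "startup_c"},
    "startup_a": {"incubator", "startup_b"},
    "startup_b": {"incubator", "startup_a"},
    "startup_c": {"incubator"},
}

def degree_centrality(g):
    """Fraction of other nodes each node is directly connected to."""
    n = len(g)
    return {v: len(nbrs) / (n - 1) for v, nbrs in g.items()}

def closeness_centrality(g):
    """Inverse of the average shortest-path distance to all other nodes (BFS)."""
    result = {}
    for src in g:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            v = queue.popleft()
            for w in g[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        result[src] = (len(g) - 1) / sum(dist.values())
    return result

def eigenvector_centrality(g, iterations=100):
    """Power iteration: a node scores high if its neighbors score high."""
    score = {v: 1.0 for v in g}
    for _ in range(iterations):
        new = {v: sum(score[w] for w in g[v]) for v in g}
        norm = max(new.values())
        score = {v: s / norm for v, s in new.items()}
    return score

deg = degree_centrality(graph)
clo = closeness_centrality(graph)
eig = eigenvector_centrality(graph)
# The hub everyone connects to dominates all three measures in this toy graph.
assert max(deg, key=deg.get) == "incubator"
assert max(clo, key=clo.get) == "incubator"
assert max(eig, key=eig.get) == "incubator"
```

In the World Bank's data the interesting cases are where the measures disagree - e.g., start-ups scoring highest on closeness while incubators score highest on eigenvector centrality.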

Other factors that the World Bank looks into: the skills pipeline (are university students graduating with the right skill sets for the job market; do they tend to emigrate after graduation; and so on), and supporting infrastructure, such as what kind of support the government gives entrepreneurs (in Colombia, for instance, start-ups receive a ton of public funding, so that they often go from one program to another without actually having to make any money).

There are also several challenges that the team faces in using data for social good. For example, there's no rule of thumb for analyzing data from different countries, even for something as simple as an address. Also, people tend to deeply distrust the government (in one instance, entrepreneurs refused to answer any World Bank surveys in the mistaken belief that they were going to be audited by the government!). Privacy laws also vary widely between countries, which makes anonymizing data that much more difficult. Plus, different cultural norms mean that the applicability of what is learned is often constrained. Agile testing and iteration are also next to impossible. Finally, there is often a data literacy gap between the various stakeholders.

Kathy believes that using data analytics to bring about social change is of prime importance, and invited those who share that belief to get in touch with her.

In summary, Kathy's presentation was both insightful and inspirational, and served to show how data analysis can be used for the greater good.

GHC16: The Digital Future: Defusing the Hype of IoT and Wearables

Speakers: Kiva Allgood, Esther Lekeu, Melissa Kreuzer, Sunny Webb, Siji Tom

The first indication that this was going to be a great session was the long line of ladies queued up half an hour in advance, waiting to get entry to the conference room - and our expectations were more than fulfilled. On the panel were five women all working in the IoT field, with different backgrounds - Kiva Allgood from Qualcomm, Melissa Kreuzer from Procter & Gamble, Esther Lekeu from Meta, and Siji Tom from Apple, with Sunny Webb from Accenture acting as moderator.

Sunny kicked off the discussion by asking whether IoT is a real thing, or still a myth. The panelists seemed to agree that while IoT is indeed a real thing, for a lot of people it's still a myth. Esther pointed out that while countries like the US and China are at the forefront of technology, other countries like Australia, where she comes from, don't really have a high level of adoption yet. Melissa concurred - while there are a lot of interconnected devices available in the market, there's also a wide range of adoption - those in developing countries have little or no access to a lot of the technology.

The panelists then went on to try and define IoT. Siji gave a base definition - it's essentially an internetwork of smart devices, or devices that have sensors. Each of these devices can collect data that can be used to make decisions that influence other devices. Kiva gave an example of how wireless sensors were used in an area of a developing country that had only 2 hours of drinking-water supply, to collect data that allowed the supply to be increased to 6 hours. Melissa said that her company views IoT as a technology platform to utilize.

Sunny's next question was to ask each of the panelists to name her favorite smart device. Siji, who works on the Apple Watch team, had an obvious answer - the Apple Watch! :-) She loves being able to turn the lights off at home remotely, or to reply to messages from her watch. Esther thinks of her smartphone as an extension of her arm, particularly because it helps her stay connected with family and friends back in Australia. Kiva uses her smartphone to track her kids' location, and completely relies on it for connectivity with her family.

The next topic of debate was what the largest open challenge in IoT is. Siji felt that the biggest challenge her team faces right now is getting consumers to adopt the technology. Also, data privacy and security are major concerns for users - and this is an area that isn't heavily regulated yet. Kiva's point of view was that interoperability is a major concern for players in the IoT field, particularly given the large number of devices that are hitting the market. Business models need changing as well. Melissa pointed out that data is a commodity that's being monetized right now, but no company is as yet willing to share the information that it has. This is something that needs to happen in a transparent manner. Esther agreed, and also felt that companies need to adjust to the large number of open-source software packages being released. Siji mentioned that a tenet her team at Apple holds strongly to is that the data that's collected belongs to the user.

The last question was what advice the panelists have for aspiring IoT engineers. Kiva believes that it's important to come up with a good problem statement and to solve it diligently. She also thinks that one should be willing to take on new challenges. Esther suggested finding a mentor and trying to get visibility. She said that this was something she had to overcome, as for a long time she was the only woman on her team, and found it hard to speak up. Melissa gave a great analogy - opposition is like antibodies; you know you're leading change when people start pushing back. She likes to get "killer issues" (i.e., critical assumptions) out early.

There was also some time set aside for questions from the audience, and there were several amazing ones. One audience member asked if there was a downside to IoT. Kiva and Melissa agreed that as parents, they are often worried about the social impact that constant use of devices has on their children. Esther mentioned that her group at Meta, which works on augmented reality, spends a lot of time with neurosurgeons and other specialists to make sure that their product has no side effects like eye strain. Siji mentioned that there are some regulations in place to cover health aspects. Another question was on whether there are any industry standards around data privacy yet. While the panelists weren't aware of much going on in that area as yet, Siji mentioned that Apple recently adopted a technique called "differential privacy" - statistical noise is added to collected data, so that no individual user can be identified but the aggregate data remains useful.
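
For context, differential privacy works by adding calibrated random noise before data is released. Here's a minimal sketch of the Laplace mechanism for a counting query (function names are my own, and this is not Apple's implementation, which uses more elaborate local-privacy techniques):

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw Laplace(0, scale) noise via inverse transform sampling."""
    u = rng.random() - 0.5                      # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, rng):
    """Release a count with epsilon-differential privacy.
    A counting query changes by at most 1 when one user is added or
    removed (sensitivity 1), so the Laplace noise scale is 1/epsilon."""
    return true_count + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(42)  # seeded only so this sketch is repeatable
noisy = private_count(1000, epsilon=0.5, rng=rng)
assert noisy != 1000             # the exact count is never released
assert abs(noisy - 1000) < 100   # true for this seed; noise is unbounded in general
```

Smaller epsilon means more noise and stronger privacy; the analyst trades accuracy for a provable bound on what any one user's data reveals.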
All in all, this was a great session, with insights into a field that is still considered "emerging tech".

Wednesday, October 19, 2016

Wednesday Opening Keynotes - Virginia Rometty

Dr. Sweeney's inspiring talk was followed by the 2016 Technical Leadership ABIE Award presentation - to Dr. Anna Patterson, VP of Engineering, Artificial Intelligence at Google. Dr. Patterson's acceptance speech was full of amusing little anecdotes (for example, how she had to manually toggle bits to debug at her first job, working on planes - debugging and IDEs have come a long way since then!), but perhaps one of the most touching moments was when she paid tribute to her grandmothers and great-grandmother for being women leaders in their own right, showing her great-grandmother's poll tax receipts from voting.

Right after Dr. Patterson's speech was the Top Companies for Women Technologists award presentation. Top Companies is a program started by the Anita Borg Institute; it is the only data-driven benchmark of the technical workforce, and shows which companies provide the most women-friendly work environments. This year, 60 companies participated in the program and were divided into one of two categories - change alliance and leadership index companies. The main things that distinguish leadership index companies (i.e., the best places for women to work) from others are their flex-time policies, formal leadership development programs for women, and formal gender diversity training for all employees. Also, this year's findings reveal that women now hold about 21.7% of technical jobs, up 0.9% from 2015. This year's award went to ThoughtWorks.

The keynote ended with a talk by Virginia Rometty, CEO and President of IBM. She started off by talking about what she believes is the biggest natural resource of the present - data. Not only has 90% of all the data out in the world today been created just in the past 2 years, but 80% of it is unstructured. Ms. Rometty believes that over the next 5 years, systems that learn using data are going to be increasingly important in the tech world. Given that, IBM is investing a great deal of its resources in the Watson systems, and has already partnered with healthcare firms like Quest Diagnostics in an effort to find oncology patients the best treatment possible.

Ms. Rometty also shared several anecdotes from her own life that led her to where she is today. She explained that her mother was perhaps the single biggest influence in her life, inspiring by example. Ms. Rometty's mother was a single mother in the 60s, a difficult situation back then, and even went back to school while working on a night job in order to take care of Ms. Rometty and her siblings. She also mentioned that she derived huge support from her husband. Early on in her career, Ms. Rometty was offered a job opportunity that rather overwhelmed her, coming, as it did, "too early". She was inclined to walk away in the belief that she needed more time and experience to gain the necessary skills. It was her husband, however, who pointed out to her that this was something a man would never have felt had the same opportunity been offered him, and urged Ms. Rometty to take the job on. Since then, she has never looked back :-). The biggest take-away from that anecdote is that growth and comfort never co-exist. So even if you are attacked by the "imposter syndrome", which women frequently are, take big challenges on and learn from them. Another life-lesson that she shared was that it is important to work on something that you are passionate about, and that you believe is bigger than yourself.

Ms. Rometty ended the talk by inviting on stage three inspiring women from IBM, who have become leaders in their fields and juggle multiple responsibilities. Their advice to women - work on things that excite you, and always appreciate the people around you for everything they do.

GHC16: Wednesday Opening Keynotes - Dr Latanya Sweeney

GHC16 got off to a great start early Wednesday morning, with an impressive line-up of speakers for the opening keynote session. First off the bat was Telle Whitney, President of ABI. She remembered organizing the very first GHC in 1994 with Anita Borg. Back then, there were 500 attendees. GHC has grown by leaps and bounds in the intervening years, with nearly 15,000 women in tech attending the conference this year!

Next up was Dr. Latanya Sweeney, the founder and director of Harvard's Data Privacy Lab, and former Chief Technologist of the Federal Trade Commission. She was also the very first African-American woman to graduate with a PhD in Computer Science from MIT.

Dr. Sweeney's talk focused on how technology impacts humans and dictates our civic future. The environment in which computers operate has changed radically over the past few decades, so much so that it now encompasses the governance of daily life around the world. When the Sony camcorder was first introduced in 1983, American laws decreed that conversations couldn't be recorded without the person's consent. So while one now had the ability to record audio using the camcorder, it was against the grain of the law to do so. Fast forward to today, and the ability to record photos, video and audio is present in virtually every device.

We now live in a technocracy - tech design determines how we live our lives. Mattress sensors are capable of measuring our sleep patterns, and that data is sent to a third party to show us how well we slept at night. The Apple Watch captures and provides all kinds of data - how much you walked, when you need to exercise, and so on. All of this data is shared with your phone.

Dr. Sweeney explained how she first got into the field of data privacy. In grad school, she discovered that matching a person's medical records to their voter ID records was relatively easy. This was the beginning of work that led to the HIPAA regulations. Several years later, she found that Googling "Latanya" showed her more images of African-American women, while googling "Tanya" gave her images of Caucasian women. Given that Google's algorithm is supposed to be impartial, this just didn't seem right.
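
The re-identification she describes is essentially a join on quasi-identifiers - ZIP code, birth date, and sex - which her later research showed are enough to uniquely identify a large majority of the U.S. population. A sketch with made-up records (all names and data below are invented for illustration):

```python
# "Anonymized" medical records: names removed, quasi-identifiers kept.
medical = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-01-15", "sex": "M", "diagnosis": "asthma"},
]

# Public voter rolls: names *and* the same quasi-identifiers.
voters = [
    {"name": "Jane Roe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "John Doe", "zip": "02139", "dob": "1962-01-15", "sex": "M"},
]

def reidentify(medical, voters):
    """Link 'anonymized' records back to names by joining on (zip, dob, sex)."""
    by_key = {(v["zip"], v["dob"], v["sex"]): v["name"] for v in voters}
    matches = []
    for rec in medical:
        key = (rec["zip"], rec["dob"], rec["sex"])
        if key in by_key:
            matches.append((by_key[key], rec["diagnosis"]))
    return matches

# Every "anonymized" record links straight back to a name.
assert reidentify(medical, voters) == [
    ("Jane Roe", "hypertension"),
    ("John Doe", "asthma"),
]
```

Defenses like k-anonymity work by coarsening the quasi-identifiers (e.g., year of birth instead of full date) until each key matches many people instead of one.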

After her stint at the FTC, Dr. Sweeney started a course called "Data Science to Save the World" at Harvard. With the first batch of students on the course, she also started publication of the "Journal of Technology Science". There were several interesting papers written by the students - for example, one student was able to come up with a method of detecting fraudulent websites using Twitter. Another was able to establish price discrimination in the Princeton Review's online SAT tutoring service - areas with a high concentration of Asian families were likely to be charged higher prices. A third student found that Facebook Messenger attached a user's geo-location to every message by default, and this information was visible to everyone in the conversation. Eventually, Facebook released a patch to ensure that this information is shared only with user consent.

This year, Dr. Sweeney's group is looking into voting data - at the moment, it's surprisingly hard to find out where one needs to go to vote. They've even been able to find erroneous data given out for some NY and NC voting locations.

Dr. Sweeney firmly believes that given the vast amount of data available to us today, each of us can save the world.

GHC16 Plenary Session: Astro Teller, Captain of Moonshots at X

Dr. Astro Teller, Captain of Moonshots (CEO) at X

Dr. Teller's talk focused mostly on the mantras followed at X to foster innovation. He began with an anecdote illustrating how his management style has developed over the years to what it is today, and how that impacts X's culture. It started with a photograph of Dr Teller with his two famous grandfathers, one a Nobel-Prize-winning economist (Gerard Debreu) and the other the famous theoretical physicist and founder of the Lawrence Livermore Labs, Edward Teller. He spoke about how, given his illustrious family background, he always had the feeling that he was the "dumb one" in the family, and how that spurred him on to work harder and study harder than anyone else. This was what led him to eventually graduate with a PhD in Artificial Intelligence. Soon after graduation, Dr. Teller started a business, and as CEO, felt that he always had to play speed chess with his employees to be an effective manager. Six months down the line, he realized that that wasn't actually the case, and the best way to lead was to amplify his employees' skills. This is what he has dedicated the past 20 years to, and at X, although his title is "Captain of Moonshots", he spends all his time on people and culture, rather than hands-on involvement in tech work.

Dr. Teller then went on to define what a moonshot really is. At X, anything is considered a moonshot if it's a huge problem that needs solving, if there's a radical solution being proposed to fix that problem, and finally, if that solution involves breakthrough technology. Some of X's recent "moonshot" projects are Google Brain, self-driving cars, and Google Loon.

The first of X's mantras for more effective innovation is to fall in love with the problem being solved, rather than the technology. For instance, saying that one wants to work on machine learning is akin to saying that one wants to build something that has transistors in it. Dr. Teller gave the example of a recent X project, headed by Kathy Cooper, which tried converting sea-water to methanol in a carbon-neutral way, as an alternative to gasoline for the approximately 4 billion internal combustion engines in use today. While the team was trying to figure out ways to effectively make the conversion, they were also thinking about alternative solutions if their original plan didn't work. Although they were eventually successful in making methanol from sea-water, the associated cost was too high, about $15 a gallon. The project therefore had to be killed, but it still garnered Kathy and her team a lot of recognition at X and beyond. The main take-away from that story is that it's OK to fail - if people aren't given credit for all the hard work they've put in on failed projects, then no one would ever have the courage to walk away from ideas that didn't work out. At the same time, sharing Kathy's work with the world means that other folks who want to solve the same problem have a body of work to look back at. X even makes it a point to celebrate all projects that have failed on the Day of the Dead.

Another mantra followed at X is "inspiring innovation". When people are inspired by the problems that they are trying to solve, they bring more of themselves to work, and are also encouraged to think differently. Astro Teller mentioned the famous "mutilated checkerboard" problem as an illustration - simply framing the same problem in different ways causes people to shift their perspective on how to solve it. This is known as perspective shifting, and is deeply ingrained in X's culture.

A third mantra at X is "T-shaped people". X doesn't necessarily believe in hiring only people whose backgrounds conventionally suit a given position. On the contrary, at X, people from different fields are welcomed in the belief that they bring fresh perspectives to the table. As an example, Astro Teller spoke of a fashion designer who was brought on board the Loon team to design and perform quality assurance tests on the balloons, ensuring that they survive in any kind of environmental conditions.

The fourth mantra - get in touch with the physical world as quickly as possible. This puts projects out into the real world early on, increasing the chances of discovering bugs and problems. For instance, recently, a self-driving car was out on a test drive in Mountain View when it came across an old lady in a wheelchair, wielding a broom and trying to get a duck off the street. This simply wasn't a situation that the self-driving car project team had thought up unit tests for :-). The car passed that little test successfully, waiting until the old lady had crossed the street. There are countless other such situations that would be difficult to account for until the project is out in the real world. Sometimes, this strategy could backfire, causing the project to be killed. But the thing to note is that if one couches failure as failure, then people are going to be wary of taking on new projects. However, if things are turned around and failures are, instead, viewed as learning opportunities, then people become much more willing to experiment, even if they fail. This is a culture that managers need to foster.

The fifth mantra is that we need to balance audacity and vulnerability. A mix of both is needed to make sure that we innovate effectively. While audacity is what drives us to try new things, vulnerability and humility are required to admit when we go wrong, and to learn new things. X's culture is built to help people achieve this balance.


Monday, October 10, 2016


I'm attending the Grace Hopper Conference this year, and I'm pretty excited!

I'm going to be blogging about sessions I attend at GHC. Stay tuned!