There are two inevitable truths in machine learning: you will always need more data, and more computational power to train on it.


In a recent article I covered a basic architecture for ML compute and storage. One of the main assumptions in that architecture was that all training data is consolidated in a single location (storage system) and, more critically, that this data is readily available for ML training. What happens, then, if we need some data for training but, for a variety of reasons, that data cannot be stored in our environment? Enter federated learning.

The approach that I outlined in my article assumed that both data and compute are centralized. Under a centralized approach, all the data…

There are a few important constraints to keep in mind with regard to GPUs.


Over the past two articles we covered the various activities involved with data collection and storage. These were part of a 3-step process that is outlined below. We now reach the final step, which is concerned with using the data we collected and stored.

Will DeepTech be the spark that transforms how we live?


My entire career has been in software. I joined Microsoft, fresh out of school, in October 2000 and have been working either as a software developer or, more recently, in engineering leadership ever since. The tech that I experienced is the kind that typically starts in a garage, where one or more software developers furiously code up an MVP. The tech that uses familiar tools like Jira and GitHub and leases infrastructure by the minute on AWS. It’s the “tech” that yielded the likes of Facebook, Amazon, LinkedIn, Google, Microsoft Teams — a product I actually worked…


In a previous article, I laid out a simple framework to navigate the various challenges involved with using data to build machine learning (ML) models. This framework is illustrated in the diagram below.

How strong are your data jujitsu skills?


A few weeks ago I hosted a series of roundtable discussions with a group of engineering leaders from Atomico’s portfolio companies. The theme of these discussions was data, specifically the challenges of data in an AI company. The topic is admittedly broad and open-ended, but it is an important one nonetheless.

A significant portion of building AI, or to be more specific machine learning (ML) models, is centered around data. Building ML models mostly involves getting data, cleaning it, transforming it, visualizing it and finally using it to build models. A recent survey by Anaconda revealed that…

It’s hard, complex, multidisciplinary yet exceptionally rewarding


My professional software experience has for the most part been in the B2B sector, working on products spanning video-conferencing, distributed file systems and SQL engines. My most recent experience, at Kheiron, is my first foray into AI and healthcare. In a previous article, I covered the differences between AI products and software ones. This article is concerned with the differences I observed in developing software for the healthcare sector.

First, a bit of context about the sector, mostly drawn from my own impressions, with empirical evidence where available.

The healthcare market is massive. In the US alone, this market is estimated…

Most AI products are primarily concerned with making inference decisions. An AI model receives some input, runs it through the model, and produces a decision, oftentimes called a prediction or inference. Depending on the nature of the AI model, the precision of the decisions it makes can have profound effects on its usage and ultimate success.

Consider a recommendation engine, perhaps one that recommends what movies to watch based on your viewing history. The AI model in this case is making a predictive decision based on your own viewing history. We’re all fairly accustomed to these recommendation engines on…

A case study of

I have been waiting for a pure-play artificial intelligence (AI) company to file for an IPO to get an in-depth look at the company’s business model and how it differs from that of traditional software products. My interest lies in trying to understand the impact of the long tail of AI on the company’s business model and financials. I had written about this topic in a couple of previous posts. The first offered a glimpse into how building AI products is fundamentally different from building traditional software. …


Equity is a significant portion of a startup’s compensation package; in fact, it is arguably one of the top reasons people join startups. In spite of the popularity and importance of equity, I’ve seen that the way it is used can vary quite significantly between companies. While I do not posit that there is one right approach to equity compensation, I do believe that there are certain outcomes that should be universally applicable. These are outlined below.

Equity >> Cash

This one should be non-controversial. As an employee joining a startup, I want to participate in the potential economic growth of the…

If there is a single unifying factor that I have found across all startups, both those I worked at and those I advised, it is the following: there will always be more work than the resources available to do it. Always.

There are two consequences to the above. The first is being ruthless about prioritizing work: you have to pick your bets, and where you invest your scarce resources, very carefully (a topic for another post). The second is the importance of excelling at recruiting. …

Karim Fanous

Tech leadership at various early stage startups: Qumulo, Dremio and now Kheiron Medical
