
LLMs, Edge compute and 2 smoking SR&ED Stats


Product for Founders

Practical tips to maximize ROI on SR&ED, R&D, technical strategy, infrastructure, and practical founder challenges - especially in the AI/ML space. Under 5 mins, 2x month.

Hi Reader,

Welcome to this week’s edition of Product for Founders, a newsletter for tech and $ savvy founders! We focus on AI must-knows for solid product decisions and the Canadian SR&ED program.

Have you ever thought about deploying an LLM without cloud access?

Think:

  • Internet-less smart assistants
  • On-device analytics for satellite applications
  • Autonomous robots
  • Low-energy devices (like a Raspberry Pi)
  • Intelligent medical devices in the human body

For the curious, the FDA published its latest list of 950 approved AI/ML-enabled devices in August.

The critical need for a product in these use cases will be low compute use and low latency.

Compute usage is affected by so many factors that a one-liner doesn't suffice.

But as a rough example: a 7B-parameter model stored in 16-bit precision needs about 14 GB of memory just to hold its weights for inference, before you count activations and the KV cache.

Why does this matter? Because the more ‘intense’ the compute and memory requirements for inference, the more powerful the hardware you are deploying to needs to be.
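To make that concrete, here is a rough back-of-the-envelope sketch in Python. The numbers are illustrative, and the "weights only" estimate deliberately ignores activations and the KV cache:

```python
# Rough back-of-the-envelope estimate of how much memory a model needs
# just to hold its weights at inference time. Real usage is higher once
# you add activations and the KV cache; numbers here are illustrative.

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_memory_gb(num_params: float, precision: str) -> float:
    """Approximate weight memory in GB for a given parameter count and precision."""
    return num_params * BYTES_PER_PARAM[precision] / 1e9

for precision in ("fp32", "fp16", "int8", "int4"):
    print(f"7B model @ {precision}: ~{weight_memory_gb(7e9, precision):.1f} GB")

# 7B model @ fp32: ~28.0 GB
# 7B model @ fp16: ~14.0 GB
# 7B model @ int8: ~7.0 GB
# 7B model @ int4: ~3.5 GB
```

Even at 4-bit, a 7B model wants a few gigabytes of memory before it does any work - which is exactly the budget most edge devices don't have.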

Can you imagine an Nvidia A100 chugging along in an implanted medical device?

Probably not. These power-hungry monsters are not suitable candidates for on-edge and on-device LLM applications.

Which brings me to Mistral’s latest announcement of two new edge-efficient models.

Impressive benchmark performance.

It’s early days - so we are yet to see reports of their usage in the wild.

But efforts in this direction will be key to making LLM usage widely available for low-power, often hazardous, applications.
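If you want to feel out how small a footprint you can get away with, here is a minimal sketch using Hugging Face transformers with 4-bit quantization via bitsandbytes. The model ID is a placeholder for whichever small model you want to try, and it assumes you have transformers, accelerate, bitsandbytes, and a CUDA-capable device:

```python
# Minimal sketch: load a small instruction-tuned model in 4-bit to shrink
# its memory footprint for constrained hardware. Assumes transformers,
# accelerate, and bitsandbytes are installed and a CUDA device is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "your-small-instruct-model"  # placeholder - swap in the edge model you want to try

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit weights: ~8x smaller than fp32
    bnb_4bit_compute_dtype=torch.float16,   # do the matmuls in fp16
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=quant_config,
    device_map="auto",
)

inputs = tokenizer("Summarize today's sensor readings:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Quantization isn't free - expect some quality drop versus full precision - but for edge budgets it's usually the first lever to pull.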

Bottom Line

When you’re building Product, think deeply about your customer and end goal.

Work backwards from that to determine technology choice, model choice and tech stack.

Refactoring is always painful - do what you can upfront to avoid it.

Sidebar: AFAIK, there is no systematic measurement of memory usage and energy consumption across models - but DM me if you’re interested; I found some interesting directional content while researching this article.
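For what it's worth, here is the kind of quick-and-dirty check I mean: peak GPU memory and wall-clock latency for a single generation. It reuses the hypothetical model and tokenizer from the sketch above, only covers GPU memory (energy you'd have to sample separately, e.g. via nvidia-smi power readings), and is a rough starting point rather than a benchmark:

```python
# Quick-and-dirty profiling of one generate() call: peak GPU memory and latency.
# Reuses `model` and `tokenizer` from the previous sketch; assumes a CUDA device.
import time
import torch

torch.cuda.reset_peak_memory_stats()
prompt = tokenizer("Summarize today's sensor readings:", return_tensors="pt").to(model.device)

start = time.perf_counter()
model.generate(**prompt, max_new_tokens=64)
elapsed = time.perf_counter() - start

peak_gb = torch.cuda.max_memory_allocated() / 1e9
print(f"peak GPU memory: {peak_gb:.2f} GB, latency: {elapsed:.2f} s")
```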


Switching gears a bit here -

Are you a Canadian entrepreneur leading a tech startup? Keep reading.

For everyone else - share this with your Canadian founder friends 😀

Here are a few impressive stats that everyone should know about R&D support in Canada.

SR&ED: the single largest federal program supporting R&D in Canadian businesses

You might be small - but you’re a mighty powerhouse of innovation

Bottom Line

How are LLMs for edge computing and SR&ED related?

If you’re doing one - you should be thinking about applying for the second.


Want to chat some more on either topic - LLMs for edge computing or SR&ED?

Reply to this email - I'm always happy to chat!

That's a wrap! Stay curious & keep innovating.

Let's build together,

Varsha


Thanks for reading! If you loved it, forward this email to your friends and colleagues.

Unsubscribe or Preferences.

© Inference PM

113 Cherry St #92768, Seattle, WA 98104-2205
