Heiko Hotz
Dec 10, 2021

Hi Dhruv - agreed, waiting 15 seconds until your lights turn on is not an ideal scenario, so maybe that wasn't the best example ;D

I've only just started playing with the new serverless inference, so I'm not sure yet how we can optimise the setup to reduce model loading time. Once I find more answers, I'll be happy to share my findings :)



Written by Heiko Hotz

Generative AI Blackbelt @ Google — All opinions are my own
