
Podcast.__init__

Illustrating The Landscape And Applications Of Deep Learning

Summary

Deep learning is a phrase that is heard more and more often as it continues to transform the standard approach to artificial intelligence and machine learning projects. Despite its ubiquity, it is often difficult to get a firm understanding of how it works and how it can be applied to a particular problem. In this episode Jon Krohn, author of Deep Learning Illustrated, shares the general concepts and useful applications of the technique, along with some of his practical experience in using it for his work. This is definitely a helpful episode for getting a better grasp of the field of deep learning and for knowing when to reach for it in your own projects.

Announcements

  • Hello and welcome to Podcast.__init__, the podcast about Python and the people who make it great.
  • When you’re ready to launch your next app or want to try a project you hear about on the show, you’ll need somewhere to deploy it, so take a look at our friends over at Linode. With 200 Gbit/s private networking, scalable shared block storage, node balancers, and a 40 Gbit/s public network, all controlled by a brand new API, you’ve got everything you need to scale up. And for your tasks that need fast computation, such as training machine learning models, they just launched dedicated CPU instances. Go to pythonpodcast.com/linode to get a $20 credit and launch a new server in under a minute. And don’t forget to thank them for their continued support of this show!
  • You listen to this show to learn and stay up to date with the ways that Python is being used, including the latest in machine learning and data analysis. For even more opportunities to meet, listen, and learn from your peers, you don’t want to miss out on this year’s conference season. We have partnered with organizations such as O’Reilly Media, Dataversity, Corinium Global Intelligence, Alluxio, and Data Council. Upcoming events include the combined events of the Data Architecture Summit and Graphorum, the Data Orchestration Summit, and Data Council in NYC. Go to pythonpodcast.com/conferences to learn more about these and other events, and take advantage of our partner discounts to save money when you register today.
  • Your host as usual is Tobias Macey and today I’m interviewing Jon Krohn about his recent book, Deep Learning Illustrated.

Interview

  • Introductions
  • How did you get introduced to Python?
  • Can you start by giving a brief description of what we’re talking about when we say deep learning and how you got involved with the field?
    • How does your background in neuroscience factor into your work on designing and building deep learning models?
  • What are some of the ways that you leverage deep learning techniques in your work?
  • What was your motivation for writing a book on the subject?
    • How did the idea of including illustrations come about and what benefit do they provide as compared to other books on this topic?
  • While planning the contents of the book what was your thought process for determining the appropriate level of depth to cover?
    • How would you characterize the target audience and what level of familiarity and proficiency in employing deep learning do you wish them to have at the end of the book?
  • How did you determine what to include and what to leave out of the book?
    • The sequencing of the book follows a useful progression from general background to specific uses and problem domains. What were some of the biggest challenges in determining which domains to highlight and how deep in each subtopic to go?
  • Because of the continually evolving nature of the field of deep learning and the associated tools, how have you guarded against obsolescence in the content and structure of the book?
    • Which libraries did you focus on for your examples and what was your selection process?
      • Now that it is published, is there anything that you would have done differently?
  • One of the critiques of deep learning is that the models are generally single purpose. How much flexibility and code reuse is possible when trying to repurpose one model pipeline for a slightly different dataset or use case?
    • I understand that deployment and maintenance of models in production environments is also difficult. What has been your experience in that regard, and what recommendations do you have for practitioners to reduce their complexity?
  • What is involved in actually creating and using a deep learning model? (A brief code sketch follows this list.)
    • Can you go over the different types of neurons and the decision making that is required when selecting the network topology?
  • In terms of the actual development process, what are some useful practices for organizing the code and data that go into a model, given the need for iterative experimentation to achieve the desired level of accuracy?
  • What is your personal workflow when building and testing a new model for a new use case?
  • What are some of the limitations of deep learning and cases where you would recommend against using it?
  • What are you most excited for in the field of deep learning and its applications?
    • What are you most concerned by?
  • Do you have any parting words or closing advice for listeners and potential readers?
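
As a companion to the question above about what goes into creating and using a deep learning model, here is a minimal sketch of that kind of workflow. It is not taken from Deep Learning Illustrated or from Jon's own code; it assumes the Keras Sequential API, with made-up data and arbitrary layer sizes, just to show the basic shape of defining, compiling, and training a small network.

    # A minimal sketch of defining and training a small network with Keras.
    # The data, layer sizes, and hyperparameters are illustrative assumptions,
    # not recommendations from the book or the episode.
    import numpy as np
    from tensorflow import keras

    # Synthetic stand-in data: 1000 samples with 20 features and binary labels.
    X = np.random.rand(1000, 20).astype("float32")
    y = np.random.randint(0, 2, size=1000)

    # A small fully connected network for binary classification.
    model = keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])

    # Compile with a standard optimizer and loss, then train briefly,
    # holding out part of the data to monitor validation accuracy.
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)

Swapping the dense layers for other layer types (convolutional, recurrent, and so on) is the kind of topology decision the interview questions above refer to.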

Keep In Touch

Picks

Closing Announcements

  • Thank you for listening! Don’t forget to check out our other show, the Data Engineering Podcast for the latest on modern data management.
  • Visit the site to subscribe to the show, sign up for the mailing list, and read the show notes.
  • If you’ve learned something or tried out a project from the show then tell us about it! Email hosts@podcastinit.com with your story.
  • To help other people find the show, please leave a review on iTunes and tell your friends and co-workers.
  • Join the community in the new Zulip chat workspace at pythonpodcast.com/chat

Links

The intro and outro music is from Requiem for a Fish by The Freak Fandango Orchestra / CC BY-SA
