Generative and Discriminative Voxel Modeling

This presentation covers generative and discriminative voxel modeling: the choice of 3D representation, background on VoxNet and variational autoencoders (VAEs), the VAE architecture and reconstruction objective, the resulting error surface, and classification with deep volumetric ConvNets (Voxception and Voxception-ResNet, training details, and orthogonal regularization). It also shows reconstruction results, samples, latent-space interpolations, and an interactive interface.

  • Voxel Modeling
  • Generative
  • Discriminative
  • Neural Networks
  • Image Processing

Presentation Transcript


  1. Generative and Discriminative Voxel Modeling (Andrew Brock)

  2. Introduction: Choice of representation is key!

  3. Background: VoxNet (Maturana et al., 2015)

  4. Background: VAEs

  5. Background: VAEs
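
For readers who want the VAE background as a formula, here is a minimal sketch of the standard VAE objective (a binary cross-entropy reconstruction term plus the closed-form KL divergence between a diagonal Gaussian posterior and a unit Gaussian prior) and the reparameterization trick. This is textbook VAE material, not code from the presentation, and the function names are illustrative.

    import numpy as np

    def vae_loss(x, x_recon, mu, logvar, eps=1e-7):
        """Standard VAE objective: reconstruction error + KL(q(z|x) || N(0, I)).

        x, x_recon : voxel occupancies and their reconstructions, values in [0, 1]
        mu, logvar : mean and log-variance of the diagonal Gaussian posterior
        """
        # Binary cross-entropy reconstruction term, summed over voxels
        bce = -np.sum(x * np.log(x_recon + eps)
                      + (1 - x) * np.log(1 - x_recon + eps))
        # Closed-form KL divergence between N(mu, exp(logvar)) and N(0, I)
        kl = -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar))
        return bce + kl

    def sample_z(mu, logvar, rng=None):
        # Reparameterization trick: z = mu + sigma * epsilon
        rng = np.random.default_rng() if rng is None else rng
        return mu + np.exp(0.5 * logvar) * rng.standard_normal(np.shape(mu))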

  6. VAE Architecture

  7. Reconstruction Objective: Standard Binary Cross-Entropy vs. Modified Binary Cross-Entropy
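
The slide names the two objectives without details, so below is a minimal sketch of what a false-positive/false-negative-weighted ("modified") binary cross-entropy can look like next to the standard one. The weighting hyperparameter gamma and its default value are illustrative assumptions, not taken from the slides.

    import numpy as np

    def standard_bce(target, output, eps=1e-7):
        # Plain binary cross-entropy summed over the voxel grid
        return -np.sum(target * np.log(output + eps)
                       + (1 - target) * np.log(1 - output + eps))

    def modified_bce(target, output, gamma=0.97, eps=1e-7):
        # Re-weighted binary cross-entropy: gamma up-weights errors on occupied
        # voxels (false negatives) and (1 - gamma) down-weights errors on empty
        # voxels (false positives), so the loss cannot be minimized simply by
        # predicting mostly empty grids. gamma = 0.97 is an illustrative value.
        return -np.sum(gamma * target * np.log(output + eps)
                       + (1 - gamma) * (1 - target) * np.log(1 - output + eps))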

  8. Error Surface

  9. Reconstruction Objective

  10. Reconstruction Results

  11. Samples and Interpolations

  12. Interface

  13. Classification: Prior Art

  14. Classification: Low-Hanging Fruit All previous works only considered relatively shallow volumetric ConvNets (or non-volumetric ConvNets).

  15. Classification: Low-Hanging Fruit All previous works only considered relatively shallow volumetric ConvNets (or non-volumetric ConvNets). Utterly unsurprisingly, deeper nets perform much better.

  16. Classification: Low-Hanging Fruit All previous works only considered relatively shallow volumetric ConvNets (or non-volumetric ConvNets). Utterly unsurprisingly, deeper nets perform much better. But that doesn't mean we have to be naïve!

  17. Voxception

  18. Voxception-ResNet

  19. Voxception-ResNet
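
The slides do not spell the block out, so the following is only an illustrative sketch of a Voxception-ResNet style unit: convolutional paths with different kernel sizes, concatenated and added back to the input through a residual shortcut, using the pre-activation Batch-Norm + ELU ordering mentioned later in the deck. Kernel sizes, channel splits, and layer counts here are assumptions, and PyTorch is used purely for convenience; this is not the presentation's implementation.

    import torch
    import torch.nn as nn

    class VRNBlock(nn.Module):
        """Illustrative Voxception-ResNet style block (channel count must be even)."""

        def __init__(self, channels):
            super().__init__()
            half = channels // 2
            # Path A: two 3x3x3 convolutions, pre-activated (BN + ELU before conv)
            self.path_a = nn.Sequential(
                nn.BatchNorm3d(channels), nn.ELU(),
                nn.Conv3d(channels, half, 3, padding=1),
                nn.BatchNorm3d(half), nn.ELU(),
                nn.Conv3d(half, half, 3, padding=1),
            )
            # Path B: 1x1x1 -> 3x3x3 -> 1x1x1 bottleneck, also pre-activated
            self.path_b = nn.Sequential(
                nn.BatchNorm3d(channels), nn.ELU(),
                nn.Conv3d(channels, half // 2, 1),
                nn.BatchNorm3d(half // 2), nn.ELU(),
                nn.Conv3d(half // 2, half // 2, 3, padding=1),
                nn.BatchNorm3d(half // 2), nn.ELU(),
                nn.Conv3d(half // 2, half, 1),
            )

        def forward(self, x):
            # Concatenate the two paths along channels, then add the shortcut
            return x + torch.cat([self.path_a(x), self.path_b(x)], dim=1)

    # Example: y = VRNBlock(32)(torch.randn(1, 32, 16, 16, 16))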

  20. Data and Training: use ELUs, Batch-Norm, and pre-activation; change the binary voxel range to {-1, 5} to encourage the network to pay more attention to positive entries (and to improve its ability to learn about negative entries); warm up on the set with 12 rotated instances per model (12 epochs), then anneal the learning rate and fine-tune on 24 rotated instances.
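
A minimal sketch of the voxel-range change described on this slide: mapping binary occupancies from {0, 1} to {-1, 5} so occupied voxels feed larger-magnitude activations into the first layer than empty ones. The helper name is illustrative.

    import numpy as np

    def rescale_voxels(grid):
        """Map a binary occupancy grid from {0, 1} to {-1, 5}.

        Empty voxels become -1 and occupied voxels become 5, so occupied
        entries carry more weight in the first-layer activations.
        """
        return np.asarray(grid, dtype=np.float32) * 6.0 - 1.0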

  21. Orthogonal Regularization: initializing weights with orthogonal matrices works well, so why not keep them orthogonal?
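
A minimal sketch of an orthogonality penalty in the spirit of this slide: keep each weight matrix close to orthogonal by penalizing the deviation of W·Wᵀ from the identity. The choice of norm and the coefficient are assumptions, since the slide does not give them.

    import numpy as np

    def orthogonal_penalty(W, strength=1e-4):
        """Penalty on the deviation of W @ W.T from the identity.

        W        : 2-D weight matrix (flatten convolution kernels to 2-D first)
        strength : regularization coefficient (illustrative value)
        """
        W = np.asarray(W, dtype=np.float64)
        gram = W @ W.T
        return strength * np.sum(np.abs(gram - np.eye(gram.shape[0])))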

  22. Results

  23. Results, but don't pay too much attention to the numbers

  24. Thanks!
