Results 1 - 10 of 12 for Chaumond (0.2 sec)

  1. model_cards/distilbert-base-multilingual-cased-README.md

    ### BibTeX entry and citation info
    
    ```bibtex
    @article{Sanh2019DistilBERTAD,
      title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
      author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
      journal={ArXiv},
      year={2019},
      volume={abs/1910.01108}
    }
    ```
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-04 16:41
    - 1.7K bytes
    - Viewed (0)
  2. model_cards/mrm8488/electricidad-base-generator/README.md

    ## Acknowledgments
    
    I thank the [🤗/transformers team](https://github.com/huggingface/transformers) for allowing me to train the model (especially [Julien Chaumond](https://twitter.com/julien_c)).

    > Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488)
    
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-09-18 07:24
    - 2.1K bytes
    - Viewed (0)
  3. model_cards/distilbert-base-cased-README.md

    ### BibTeX entry and citation info
    
    ```bibtex
    @article{Sanh2019DistilBERTAD,
      title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
      author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
      journal={ArXiv},
      year={2019},
      volume={abs/1910.01108}
    }
    ```
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-04 16:41
    - 1.3K bytes
    - Viewed (0)
  4. model_cards/distilroberta-base-README.md

    ### BibTeX entry and citation info
    
    ```bibtex
    @article{Sanh2019DistilBERTAD,
      title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
      author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
      journal={ArXiv},
      year={2019},
      volume={abs/1910.01108}
    }
    ```
    
    <a href="https://huggingface.co/exbert/?model=distilroberta-base">
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-04 16:41
    - 1.8K bytes
    - Viewed (0)
  5. model_cards/mrm8488/electricidad-base-discriminator/README.md

    ## Acknowledgments
    
    I thank the [🤗/transformers team](https://github.com/huggingface/transformers) for allowing me to train the model (especially [Julien Chaumond](https://twitter.com/julien_c)).

    > Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488)
    
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-09-28 17:40
    - 3.3K bytes
    - Viewed (0)
  6. model_cards/distilbert-base-uncased-README.md

    ### BibTeX entry and citation info
    
    ```bibtex
    @article{Sanh2019DistilBERTAD,
      title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
      author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
      journal={ArXiv},
      year={2019},
      volume={abs/1910.01108}
    }
    ```
    
    <a href="https://huggingface.co/exbert/?model=distilbert-base-uncased">
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-04 16:41
    - 8.3K bytes
    - Viewed (0)
  7. setup.py

        deps["tqdm"],       # progress bars in model download and training scripts
    ]
    
    setup(
        name="transformers",
        version="4.0.0-rc-1",
        author="Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Sam Shleifer, Patrick von Platen, Sylvain Gugger, Google AI Language Team Authors, Open AI team Authors, Facebook AI Authors, Carnegie Mellon University Authors",
        author_email="******@****.***",
    Python
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-27 17:25
    - 9.2K bytes
    - Viewed (0)
  8. examples/distillation/README.md

    ```
    @inproceedings{sanh2019distilbert,
      title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
      author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas},
      booktitle={NeurIPS EMC^2 Workshop},
      year={2019}
    }
    ```
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-22 03:58
    - 14.6K bytes
    - Viewed (0)
  9. README.md

    ```bibtex
    @inproceedings{wolf-etal-2020-transformers,
        title = "Transformers: State-of-the-Art Natural Language Processing",
    Plain Text
    - Registered: 2020-11-29 10:36
    - Last Modified: 2020-11-27 17:31
    - 27K bytes
    - Viewed (0)
  10. doc/testimonials/testimonials.rst

    models in production and they are also operationally very pleasant to work with.
    
    .. raw:: html
    
       <span class="testimonial-author">
    
    Julien Chaumond, Chief Technology Officer
    
    .. raw:: html
    
       </span>
       </div>
       <div class="sk-testimonial-div-box">
    
    .. image:: images/huggingface.png
        :width: 120pt
        :align: center
    Plain Text
    - Registered: 2020-10-16 09:24
    - Last Modified: 2019-10-15 20:02
    - 28.9K bytes
    - Viewed (0)
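The `setup.py` snippet in result 7 indexes a `deps` table (`deps["tqdm"]`) rather than repeating version specifiers inline. A minimal sketch of that dependency-table pattern, using illustrative package pins that are assumptions here, not the actual pins from the transformers repository:

```python
import re
from setuptools import setup

# Master list of pinned requirements (illustrative pins, not the real ones).
_deps = [
    "numpy>=1.17",
    "tokenizers==0.9.4",
    "tqdm>=4.27",
]

# Build a lookup table mapping a bare package name to its full specifier,
# e.g. deps["tqdm"] == "tqdm>=4.27", so each pin lives in exactly one place.
deps = {re.match(r"^[^<>=!~;\s]+", d).group(0): d for d in _deps}

setup(
    name="transformers",
    version="4.0.0-rc-1",
    install_requires=[
        deps["numpy"],
        deps["tokenizers"],
        deps["tqdm"],  # progress bars in model download and training scripts
    ],
)
```

Keeping every pin in one table means a version bump touches a single line, which is the apparent motivation for the `deps["tqdm"]` lookup visible in the snippet.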