Results 1 - 10 of 18 for Pedregosa (0.19 sec)

  1. .mailmap

    Emmanuelle Gouillart <******@****.***>
    Emmanuelle Gouillart <******@****.***> <emma@aleph.(none)>
    Eustache Diemert <******@****.***>
    Fabian Pedregosa <fabian.pedregosa@inria.fr>
    Fabian Pedregosa <fabian.pedregosa@inria.fr> <******@****.***>
    Fabian Pedregosa <fabian.pedregosa@inria.fr> <******@****.***>
    Federico Vaggi <******@****.***>
    Federico Vaggi <******@****.***> <******@****.***>
    Plain Text
    - Registered: 2020-10-16 09:24
    - Last Modified: 2017-08-24 11:24
    - 7.1K bytes
    - Viewed (0)
  2. doc/whats_new/older_versions.rst

      :func:`linear_model.lars_path` [`Alexandre Gramfort`_ and `Fabian
      Pedregosa`_].
    
    - Fixes for liblinear ordering of labels and sign of coefficients
      [Dan Yamins, Paolo Losi, `Mathieu Blondel`_ and `Fabian Pedregosa`_].
    
    - Performance improvements for Nearest Neighbors algorithm in
      high-dimensional spaces [`Fabian Pedregosa`_].
    
    - Performance improvements for :class:`~cluster.KMeans` [`Gael
    Plain Text
    - Registered: 2020-10-16 09:24
    - Last Modified: 2020-10-07 03:45
    - 44.2K bytes
    - Viewed (0)
  3. examples/neighbors/plot_regression.py

    using a k-Nearest Neighbor and the interpolation of the
    target using both barycenter and constant weights.
    
    """
    print(__doc__)
    
    # Author: Alexandre Gramfort <******@****.***>
    #         Fabian Pedregosa <fabian.pedregosa@inria.fr>
    #
    # License: BSD 3 clause (C) INRIA
    
    
    # #############################################################################
    # Generate sample data
    import numpy as np
    import matplotlib.pyplot as plt
    Python
    - Registered: 2020-10-16 09:24
    - Last Modified: 2019-06-14 15:38
    - 1.4K bytes
    - Viewed (0)
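Not part of the search results themselves: the snippet above describes nearest-neighbor regression with barycenter and constant weights. A minimal sketch of that idea, assuming the current scikit-learn API where the comparable options are the `weights="distance"` and `weights="uniform"` settings of `KNeighborsRegressor`:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Noisy samples of a 1-D sine curve
rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 0.5 * rng.rand(8)  # perturb every 5th target

# Dense grid of query points to interpolate the target on
T = np.linspace(0, 5, 500)[:, np.newaxis]

predictions = {}
for weights in ("uniform", "distance"):
    knn = KNeighborsRegressor(n_neighbors=5, weights=weights)
    predictions[weights] = knn.fit(X, y).predict(T)
```

With uniform weights each of the 5 neighbors contributes equally; with distance weights closer neighbors dominate, which is the spirit of the barycenter interpolation the example's docstring mentions.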
  4. examples/linear_model/plot_lasso_lars.py

    algorithm on the diabetes dataset. Each color represents a different
    feature of the coefficient vector, and this is displayed as a function
    of the regularization parameter.
    
    """
    print(__doc__)
    
    # Author: Fabian Pedregosa <fabian.pedregosa@inria.fr>
    #         Alexandre Gramfort <******@****.***>
    # License: BSD 3 clause
    
    import numpy as np
    import matplotlib.pyplot as plt
    
    from sklearn import linear_model
    Python
    - Registered: 2020-10-16 09:24
    - Last Modified: 2019-08-25 03:17
    - 1K bytes
    - Viewed (0)
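The result above excerpts the Lasso-LARS path example on the diabetes dataset. A short, plot-free sketch of the computation it describes, using `linear_model.lars_path` (the function also named in result 2); parameter choices here are illustrative:

```python
from sklearn import datasets, linear_model

# Diabetes regression data bundled with scikit-learn
X, y = datasets.load_diabetes(return_X_y=True)

# Full Lasso regularization path computed with the LARS algorithm
alphas, _, coefs = linear_model.lars_path(X, y, method="lasso")

# coefs has one row per feature and one column per step of the path;
# each row is the curve one colored line in the example traces as a
# function of the regularization parameter.
```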
  5. examples/manifold/plot_swissroll.py

    """
    ===================================
    Swiss Roll reduction with LLE
    ===================================
    
    An illustration of Swiss Roll reduction
    with locally linear embedding
    """
    
    # Author: Fabian Pedregosa -- <fabian.pedregosa@inria.fr>
    # License: BSD 3 clause (C) INRIA 2011
    
    print(__doc__)
    
    import matplotlib.pyplot as plt
    
    # This import is needed to modify the way figure behaves
    from mpl_toolkits.mplot3d import Axes3D
    Axes3D
    Python
    - Registered: 2020-10-16 09:24
    - Last Modified: 2019-10-27 21:17
    - 1.2K bytes
    - Viewed (0)
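A minimal non-plotting sketch of the Swiss Roll reduction the docstring above describes, assuming the current `make_swiss_roll` and `LocallyLinearEmbedding` APIs; the neighbor count is an illustrative choice, not taken from the example:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# 3-D Swiss Roll point cloud; `color` is the coordinate along the roll
X, color = make_swiss_roll(n_samples=1000, random_state=0)

# Unroll the manifold into 2 dimensions with standard LLE
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2)
X_unrolled = lle.fit_transform(X)
```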
  6. examples/linear_model/plot_ridge_path.py

    exhibit big oscillations. In practice it is necessary to tune alpha
    in such a way that a balance is maintained between both.
    """
    
    # Author: Fabian Pedregosa -- <fabian.pedregosa@inria.fr>
    # License: BSD 3 clause
    
    print(__doc__)
    
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn import linear_model
    
    # X is the 10x10 Hilbert matrix
    Python
    - Registered: 2020-10-16 09:24
    - Last Modified: 2017-06-20 12:48
    - 2.1K bytes
    - Viewed (0)
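The snippet above breaks off at the comment "X is the 10x10 Hilbert matrix". A sketch of the setup that comment and the docstring imply: sweep the ridge penalty on an ill-conditioned design and record how each coefficient moves (the alpha grid here is an assumption, not copied from the example):

```python
import numpy as np
from sklearn import linear_model

# 10x10 Hilbert matrix: a classically ill-conditioned design matrix
X = 1.0 / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
y = np.ones(10)

# Trace each coefficient as a function of the regularization strength.
# Too little alpha lets coefficients oscillate wildly; too much shrinks
# them toward zero - the balance the docstring says must be tuned.
alphas = np.logspace(-10, -2, 200)
coefs = []
for a in alphas:
    ridge = linear_model.Ridge(alpha=a, fit_intercept=False)
    ridge.fit(X, y)
    coefs.append(ridge.coef_)
coefs = np.array(coefs)
```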
  7. doc/authors_emeritus.rst

    - Brian Holt
    - Arnaud Joly
    - Thouis (Ray) Jones
    - Kyle Kastner
    - manoj kumar
    - Robert Layton
    - Wei Li
    - Paolo Losi
    - Gilles Louppe
    - Vincent Michel
    - Jarrod Millman
    - Alexandre Passos
    - Fabian Pedregosa
    - Peter Prettenhofer
    - (Venkat) Raghav, Rajagopalan
    - Jacob Schreiber
    - Jake Vanderplas
    - David Warde-Farley
    Plain Text
    - Registered: 2020-10-16 09:24
    - Last Modified: 2019-10-02 19:14
    - 573 bytes
    - Viewed (0)
  8. doc/about.rst

    citations to the following paper:
    
      `Scikit-learn: Machine Learning in Python
      <http://jmlr.csail.mit.edu/papers/v12/pedregosa11a.html>`_, Pedregosa
      *et al.*, JMLR 12, pp. 2825-2830, 2011.
    
      Bibtex entry::
    
        @article{scikit-learn,
         title={Scikit-learn: Machine Learning in {P}ython},
         author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V.
                 and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P.
    Plain Text
    - Registered: 2020-10-16 09:24
    - Last Modified: 2020-08-11 16:26
    - 14.2K bytes
    - Viewed (0)
  9. doc/whats_new/v0.13.rst

      eigenmaps" transformation for non-linear dimensionality reduction by Wei
      Li. See :ref:`spectral_embedding` in the user guide.
    
    - :class:`isotonic.IsotonicRegression` by `Fabian Pedregosa`_, `Alexandre Gramfort`_
      and `Nelle Varoquaux`_,
    
    
    Changelog
    ---------
    
    - :func:`metrics.zero_one_loss` (formerly ``metrics.zero_one``) now has
    Plain Text
    - Registered: 2020-10-16 09:24
    - Last Modified: 2019-08-02 01:54
    - 14K bytes
    - Viewed (0)
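The changelog entry above credits `isotonic.IsotonicRegression`. A minimal sketch of what that estimator does, fitting a non-decreasing curve to noisy increasing data (the toy data here is invented for illustration):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.RandomState(0)
x = np.arange(50, dtype=float)
y = 5 * np.log1p(x) + rng.randn(50)  # increasing trend plus noise

# Fit the best non-decreasing step function to (x, y)
ir = IsotonicRegression()
y_fit = ir.fit_transform(x, y)
# y_fit is monotone non-decreasing by construction
```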
  10. examples/manifold/plot_lle_digits.py

    module, are supervised dimensionality reduction methods, i.e. they make use of
    the provided labels, contrary to other methods.
    """
    
    # Authors: Fabian Pedregosa <fabian.pedregosa@inria.fr>
    #          Olivier Grisel <******@****.***>
    #          Mathieu Blondel <******@****.***>
    #          Gael Varoquaux
    # License: BSD 3 clause (C) INRIA 2011
    
    from time import time
    Python
    - Registered: 2020-10-16 09:24
    - Last Modified: 2020-05-18 16:35
    - 9.2K bytes
    - Viewed (0)
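A compact sketch of the embedding step behind the digits example above, projecting a subset of the handwritten-digits data to 2-D with LLE; the neighbor count and the restriction to six digit classes are illustrative assumptions, not values quoted from the file:

```python
from sklearn.datasets import load_digits
from sklearn.manifold import LocallyLinearEmbedding

# 8x8 digit images for classes 0-5, flattened to 64-dimensional vectors
X, y = load_digits(n_class=6, return_X_y=True)

# Unsupervised 2-D embedding (label-free, unlike the supervised
# discriminant methods the docstring contrasts it with)
embedding = LocallyLinearEmbedding(n_neighbors=30, n_components=2)
X_2d = embedding.fit_transform(X)
```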