  1. Cross-entropy - Wikipedia

    Logistic regression typically optimizes the log loss for all the observations on which it is trained, which is the same as optimizing the average cross-entropy in the sample.
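    The equivalence in this snippet (minimizing log loss over all observations = minimizing the average cross-entropy) can be sketched in a few lines of plain Python; the function name `log_loss` and the clamping constant `eps` are illustrative choices, not from the source:

    ```python
    import math

    def log_loss(y_true, p_pred, eps=1e-12):
        """Average binary cross-entropy (log loss) over a sample.

        For each observation with label y in {0, 1} and predicted
        probability p, the per-example loss is
        -(y*log(p) + (1-y)*log(1-p)); we return the sample mean.
        """
        total = 0.0
        for y, p in zip(y_true, p_pred):
            p = min(max(p, eps), 1.0 - eps)  # clamp to avoid log(0)
            total += -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))
        return total / len(y_true)

    # Confident correct predictions give a small loss; a maximally
    # uncertain prediction (p = 0.5) costs exactly log(2) per example.
    print(log_loss([1, 0, 1], [0.9, 0.1, 0.8]))
    ```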

  2. What Is Cross-Entropy Loss Function? - GeeksforGeeks

    Aug 1, 2025 · Cross-entropy loss is a way to measure how close a model’s predictions are to the correct answers in classification problems. It helps train models to make more confident and accurate …

  3. CrossEntropyLoss — PyTorch 2.9 documentation

    Specifies the amount of smoothing when computing the loss, where 0.0 means no smoothing. The targets become a mixture of the original ground truth and a uniform distribution as described in …
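    The mixture described in this snippet (one-hot target blended with a uniform distribution) can be reproduced without PyTorch; this is a dependency-free sketch of that behavior, not the library's implementation, and the function name `smoothed_cross_entropy` is invented for illustration:

    ```python
    import math

    def smoothed_cross_entropy(logits, target, smoothing=0.0):
        """Cross-entropy against a label-smoothed target distribution.

        With K classes, the smoothed target is
        q_k = (1 - smoothing) * onehot_k + smoothing / K,
        so smoothing=0.0 recovers ordinary cross-entropy.
        """
        K = len(logits)
        # Numerically stable log-softmax via the log-sum-exp trick.
        m = max(logits)
        log_z = m + math.log(sum(math.exp(x - m) for x in logits))
        log_probs = [x - log_z for x in logits]
        q = [(1.0 - smoothing) * (1.0 if k == target else 0.0) + smoothing / K
             for k in range(K)]
        return -sum(qk * lp for qk, lp in zip(q, log_probs))
    ```

    With `smoothing=0.0` this matches the plain negative log-likelihood of the target class; raising `smoothing` penalizes over-confident predictions by forcing some probability mass onto the other classes.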

  4. Cross-Entropy Loss Function in Machine Learning: Enhancing Model ...

    Aug 10, 2024 · Cross-entropy is a popular loss function used in machine learning to measure the performance of a classification model. Namely, it measures the difference between the discovered …

  5. A Simple Introduction to Cross Entropy Loss

    The Cross Entropy Loss is a standard evaluation function in machine learning, used to assess model performance for classification problems. This article will cover how Cross Entropy is calculated, and …

  6. A Brief Overview of Cross Entropy Loss - Medium

    Sep 25, 2024 · Cross entropy loss is a mechanism to quantify how well a model’s predictions match the actual outcomes, rewarding the model for assigning higher probabilities to correct answers.

  7. Loss Functions — ML Glossary documentation - Read the Docs

    In binary classification, where the number of classes \(M\) equals 2, cross-entropy can be calculated as: If \(M > 2\) (i.e. multiclass classification), we calculate a separate loss for each class label per …
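    The two cases in this snippet (the \(M = 2\) formula and the per-class sum for \(M > 2\)) can be written out directly; a minimal sketch with illustrative function names, assuming well-formed probabilities strictly between 0 and 1:

    ```python
    import math

    def binary_cross_entropy(y, p):
        """M = 2 case: -(y*log(p) + (1-y)*log(1-p)) for y in {0, 1}."""
        return -(y * math.log(p) + (1 - y) * math.log(1.0 - p))

    def multiclass_cross_entropy(y_onehot, probs):
        """M > 2 case: sum over class labels c of -y_c * log(p_c)."""
        return -sum(y * math.log(p) for y, p in zip(y_onehot, probs) if y > 0)
    ```

    With two classes the multiclass sum reduces to the binary formula, since exactly one entry of the one-hot target is nonzero.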

  8. Cross-Entropy Loss: Simple Explanations, Maths Explained

    May 3, 2025 · In this simple scenario, you've just implemented a rudimentary "loss function" - the feedback mechanism that powers machine learning. From facial recognition to language translation, …

  9. Loss Functions for Classification: Cross-Entropy Explained - LinkedIn

    Dec 20, 2025 · Understanding how different cross-entropy losses work gives you the power to fix training dynamics, make smarter modeling choices, and boost real-world performance.

  10. Cross Entropy Loss | Machine Learning Theory

    What loss functions make the multivariate regression rule μ(x) ideal for the one-hot encoded target?