Loss functions

Work-in-progress

This page of the docs is still a work-in-progress. Check back later!

GCPDecompositions.AbstractLossType
AbstractLoss

Abstract type for GCP loss functions $f(x,m)$, where $x$ is the data entry and $m$ is the model entry.

Concrete types ConcreteLoss <: AbstractLoss should implement:

  • value(loss::ConcreteLoss, x, m) that computes the value of the loss function $f(x,m)$
  • deriv(loss::ConcreteLoss, x, m) that computes the value of the partial derivative $\partial_m f(x,m)$ with respect to $m$
  • domain(loss::ConcreteLoss) that returns an Interval from IntervalSets.jl defining the domain for $m$
source
GCPDecompositions.valueFunction
value(loss, x, m)

Compute the value of the (entrywise) loss function loss for data entry x and model entry m.

source
GCPDecompositions.derivFunction
deriv(loss, x, m)

Compute the derivative of the (entrywise) loss function loss at the model entry m for the data entry x.

source
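As a language-agnostic illustration of this two-function contract, here is a minimal Python sketch of value and deriv for the least-squares loss $f(x,m) = (x-m)^2$. The names mirror the Julia API but are illustrative, not the package's code.

```python
# Minimal sketch of the value/deriv contract, using the
# least-squares loss f(x, m) = (x - m)^2 as the example.
# Illustrative only; the package implements these in Julia.

def value(x, m):
    """Entrywise loss f(x, m) for data entry x and model entry m."""
    return (x - m) ** 2

def deriv(x, m):
    """Partial derivative of f(x, m) with respect to the model entry m."""
    return -2.0 * (x - m)

def finite_diff(f, x, m, h=1e-6):
    """Central finite difference in m, for sanity-checking deriv."""
    return (f(x, m + h) - f(x, m - h)) / (2 * h)
```

Any concrete loss should satisfy the same consistency check: deriv must agree with a numerical derivative of value.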
GCPDecompositions.BernoulliLogitLossType
BernoulliLogitLoss(eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Bernoulli data X with log odds of success given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Bernoulli}(\rho_i)$
  • Link function: $m_i = \log(\frac{\rho_i}{1 - \rho_i})$
  • Loss function: $f(x, m) = \log(1 + e^m) - xm$
  • Domain: $m \in \mathbb{R}$
source
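A Python sketch of this loss and its derivative. The overflow-safe softplus is a standard implementation trick assumed here, not something this docstring specifies, and the function names are illustrative.

```python
import math

def bernoulli_logit_value(x, m):
    """f(x, m) = log(1 + e^m) - x*m, with an overflow-safe softplus."""
    softplus = m + math.log1p(math.exp(-m)) if m > 0 else math.log1p(math.exp(m))
    return softplus - x * m

def bernoulli_logit_deriv(x, m):
    """d/dm f(x, m) = sigmoid(m) - x."""
    return 1.0 / (1.0 + math.exp(-m)) - x
```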
GCPDecompositions.BernoulliOddsLossType
BernoulliOddsLoss(eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Bernoulli data X with odds of success given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Bernoulli}(\rho_i)$
  • Link function: $m_i = \frac{\rho_i}{1 - \rho_i}$
  • Loss function: $f(x, m) = \log(m + 1) - x\log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
GCPDecompositions.BetaDivergenceLossType
BetaDivergenceLoss(β::Real, eps::Real)

β-divergence loss for a given β.

  • Loss function: $f(x, m; \beta) = \begin{cases} \frac{1}{\beta} m^{\beta} - \frac{1}{\beta - 1} x m^{\beta - 1} & \beta \in \mathbb{R} \setminus \{0, 1\} \\ m - x \log(m) & \beta = 1 \\ \frac{x}{m} + \log(m) & \beta = 0 \end{cases}$
  • Domain: $m \in [0, \infty)$
source
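The piecewise definition translates directly into code. Below is a hedged Python sketch; whether and where the package applies eps inside these expressions is not specified above, so this sketch omits it.

```python
import math

def beta_divergence_value(x, m, beta):
    """Beta-divergence loss f(x, m; beta), following the piecewise definition."""
    if beta == 1:
        return m - x * math.log(m)
    if beta == 0:
        return x / m + math.log(m)
    return (m ** beta) / beta - x * (m ** (beta - 1)) / (beta - 1)
```

For β = 2 this reduces (up to terms constant in m) to a least-squares-type loss, while β = 1 and β = 0 recover the Poisson- and Gamma-type losses listed elsewhere on this page.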
GCPDecompositions.CustomLossType
CustomLoss

Type for user-defined loss functions $f(x,m)$, where $x$ is the data entry and $m$ is the model entry.

Contains the following fields:

  • func::Function : function that evaluates the loss function $f(x,m)$
  • deriv::Function : function that evaluates the partial derivative $\partial_m f(x,m)$ with respect to $m$
  • domain::Interval : Interval from IntervalSets.jl defining the domain for $m$

The constructor is CustomLoss(func; deriv, domain). If not provided,

  • deriv is automatically computed from func using forward-mode automatic differentiation
  • domain gets a default value of Interval(-Inf, +Inf)
source
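To illustrate the shape of this API, here is a Python sketch of an analogous container. Note the real package computes the default deriv with forward-mode automatic differentiation; this sketch substitutes a central finite difference, and all names are illustrative.

```python
import math

class CustomLoss:
    """Sketch of a user-defined entrywise loss f(x, m).

    The real CustomLoss derives the default deriv via forward-mode
    autodiff; this illustration uses a central finite difference.
    """
    def __init__(self, func, deriv=None, domain=(-math.inf, math.inf)):
        self.func = func
        self.deriv = deriv or (lambda x, m, h=1e-6:
                               (func(x, m + h) - func(x, m - h)) / (2 * h))
        self.domain = domain

# Usage: only func is required; deriv and domain get defaults.
loss = CustomLoss(lambda x, m: (x - m) ** 2)
```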
GCPDecompositions.GammaLossType
GammaLoss(eps::Real = 1e-10)

Loss corresponding to a statistical assumption of Gamma-distributed data X with scale given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Gamma}(k, \sigma_i)$
  • Link function: $m_i = k \sigma_i$
  • Loss function: $f(x,m) = \frac{x}{m + \epsilon} + \log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
GCPDecompositions.HuberLossType
HuberLoss(Δ::Real)

Huber loss for a given Δ.

  • Loss function: $f(x, m) = \begin{cases} (x - m)^2 & |x - m| \leq \Delta \\ 2\Delta |x - m| - \Delta^2 & \text{otherwise} \end{cases}$
  • Domain: $m \in \mathbb{R}$
source
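A direct Python sketch of the piecewise definition (illustrative names). Note that both pieces agree at $|x - m| = \Delta$, so the loss is continuous: the quadratic piece gives $\Delta^2$ and the linear piece gives $2\Delta^2 - \Delta^2 = \Delta^2$.

```python
def huber_value(x, m, delta):
    """Huber loss: quadratic near the data entry, linear in the tails."""
    r = abs(x - m)
    if r <= delta:
        return (x - m) ** 2
    return 2 * delta * r - delta ** 2
```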
GCPDecompositions.LeastSquaresLossType
LeastSquaresLoss()

Loss corresponding to conventional CP decomposition. Corresponds to a statistical assumption of Gaussian data X with mean given by the low-rank model tensor M.

  • Distribution: $x_i \sim \mathcal{N}(\mu_i, \sigma)$
  • Link function: $m_i = \mu_i$
  • Loss function: $f(x,m) = (x-m)^2$
  • Domain: $m \in \mathbb{R}$
source
GCPDecompositions.NegativeBinomialOddsLossType
NegativeBinomialOddsLoss(r::Integer, eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Negative Binomial data X with odds of failure given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{NegativeBinomial}(r, \rho_i)$
  • Link function: $m_i = \frac{\rho_i}{1 - \rho_i}$
  • Loss function: $f(x, m) = (r + x) \log(1 + m) - x\log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
GCPDecompositions.NonnegativeLeastSquaresLossType
NonnegativeLeastSquaresLoss()

Loss corresponding to nonnegative CP decomposition. Corresponds to a statistical assumption of Gaussian data X with nonnegative mean given by the low-rank model tensor M.

  • Distribution: $x_i \sim \mathcal{N}(\mu_i, \sigma)$
  • Link function: $m_i = \mu_i$
  • Loss function: $f(x,m) = (x-m)^2$
  • Domain: $m \in [0, \infty)$
source
GCPDecompositions.PoissonLogLossType
PoissonLogLoss()

Loss corresponding to a statistical assumption of Poisson data X with log-rate given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Poisson}(\lambda_i)$
  • Link function: $m_i = \log \lambda_i$
  • Loss function: $f(x,m) = e^m - x m$
  • Domain: $m \in \mathbb{R}$
source
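Because the link is logarithmic, the derivative $e^m - x$ vanishes at $m = \log(x)$ for $x > 0$, so that is where the entrywise loss is minimized. A Python sketch with illustrative names:

```python
import math

def poisson_log_value(x, m):
    """f(x, m) = e^m - x*m for Poisson data under a log-rate link."""
    return math.exp(m) - x * m

def poisson_log_deriv(x, m):
    """d/dm f(x, m) = e^m - x, which vanishes at m = log(x)."""
    return math.exp(m) - x
```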
GCPDecompositions.PoissonLossType
PoissonLoss(eps::Real = 1e-10)

Loss corresponding to a statistical assumption of Poisson data X with rate given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Poisson}(\lambda_i)$
  • Link function: $m_i = \lambda_i$
  • Loss function: $f(x,m) = m - x \log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
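A Python sketch following the formula above, where eps keeps the logarithm finite on the boundary of the domain at m = 0 (illustrative names, not the package's API):

```python
import math

def poisson_value(x, m, eps=1e-10):
    """f(x, m) = m - x*log(m + eps); eps keeps the log finite at m = 0."""
    return m - x * math.log(m + eps)

def poisson_deriv(x, m, eps=1e-10):
    """d/dm f(x, m) = 1 - x/(m + eps), which vanishes near m = x."""
    return 1.0 - x / (m + eps)
```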
GCPDecompositions.RayleighLossType
RayleighLoss(eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Rayleigh data X with scale given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Rayleigh}(\theta_i)$
  • Link function: $m_i = \sqrt{\frac{\pi}{2}} \theta_i$
  • Loss function: $f(x, m) = 2\log(m + \epsilon) + \frac{\pi}{4}(\frac{x}{m + \epsilon})^2$
  • Domain: $m \in [0, \infty)$
source