Loss functions

Work-in-progress

This page of the docs is still a work-in-progress. Check back later!

GCPDecompositions.GCPLosses.AbstractLoss (Type)
AbstractLoss

Abstract type for GCP loss functions $f(x,m)$, where $x$ is the data entry and $m$ is the model entry.

Concrete types ConcreteLoss <: AbstractLoss should implement:

  • value(loss::ConcreteLoss, x, m) that computes the value of the loss function $f(x,m)$
  • deriv(loss::ConcreteLoss, x, m) that computes the value of the partial derivative $\partial_m f(x,m)$ with respect to $m$
  • domain(loss::ConcreteLoss) that returns an Interval from IntervalSets.jl defining the domain for $m$
source
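
As an example, a new loss can be added by declaring a concrete subtype and implementing the three methods above. Below is a minimal sketch for a hypothetical absolute-error loss (the name MyAbsoluteLoss is illustrative and not part of the package):

    using IntervalSets
    using GCPDecompositions: GCPLosses

    # Hypothetical absolute-error loss f(x, m) = |x - m|,
    # with sign(m - x) used as a (sub)derivative in m.
    struct MyAbsoluteLoss <: GCPLosses.AbstractLoss end

    GCPLosses.value(::MyAbsoluteLoss, x, m) = abs(x - m)
    GCPLosses.deriv(::MyAbsoluteLoss, x, m) = sign(m - x)
    GCPLosses.domain(::MyAbsoluteLoss) = Interval(-Inf, +Inf)
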
GCPDecompositions.GCPLosses.BernoulliLogit (Type)
BernoulliLogit(eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Bernoulli data X with log odds of success given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Bernoulli}(\rho_i)$
  • Link function: $m_i = \log(\frac{\rho_i}{1 - \rho_i})$
  • Loss function: $f(x, m) = \log(1 + e^m) - xm$
  • Domain: $m \in \mathbb{R}$
source
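
In terms of the log odds m, this is the familiar logistic (cross-entropy) loss. A quick numeric check, assuming value follows the interface described under AbstractLoss:

    using GCPDecompositions: GCPLosses

    loss = GCPLosses.BernoulliLogit()

    # f(x, m) = log(1 + e^m) - x*m; at m = 0 both outcomes are equally
    # likely, so the loss is log(2) regardless of the observed x.
    GCPLosses.value(loss, 1, 0.0)   # ≈ log(2) ≈ 0.693
    GCPLosses.value(loss, 0, 0.0)   # ≈ log(2) ≈ 0.693
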
GCPDecompositions.GCPLosses.BernoulliOdds (Type)
BernoulliOdds(eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Bernoulli data X with odds of success given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Bernoulli}(\rho_i)$
  • Link function: $m_i = \frac{\rho_i}{1 - \rho_i}$
  • Loss function: $f(x, m) = \log(m + 1) - x\log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
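
The eps offset keeps the logarithm finite when a model entry sits at the boundary m = 0 of the domain. A small check, again assuming value follows the AbstractLoss interface:

    using GCPDecompositions: GCPLosses

    loss = GCPLosses.BernoulliOdds()   # eps = 1e-10 by default

    # With x = 1 and m = 0 the term -x*log(m) would be infinite;
    # the offset gives log(1) - log(1e-10) ≈ 23.03 instead.
    GCPLosses.value(loss, 1, 0.0)
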
GCPDecompositions.GCPLosses.BetaDivergence (Type)
BetaDivergence(β::Real, eps::Real)

β-divergence loss for a given β.

  • Loss function: $f(x, m; \beta) = \begin{cases} \frac{1}{\beta} m^{\beta} - \frac{1}{\beta - 1} x m^{\beta - 1} & \text{if } \beta \in \mathbb{R} \setminus \{0, 1\} \\ m - x \log(m) & \text{if } \beta = 1 \\ \frac{x}{m} + \log(m) & \text{if } \beta = 0 \end{cases}$
  • Domain: $m \in [0, \infty)$
source
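
The following plain-Julia sketch of the piecewise formula above can be handy for checking values (it is not the package's internal implementation and ignores eps). Note that β = 1 reproduces the Poisson loss, and β = 2 reproduces the least-squares loss up to a term that depends only on x.

    # Reference implementation of the piecewise β-divergence formula.
    function beta_divergence(x, m, β)
        if β == 0
            return x / m + log(m)
        elseif β == 1
            return m - x * log(m)
        else
            return m^β / β - x * m^(β - 1) / (β - 1)
        end
    end

    beta_divergence(2.0, 1.5, 2)   # (1.5^2)/2 - 2*1.5 = -1.875
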
GCPDecompositions.GCPLosses.Gamma (Type)
Gamma(eps::Real = 1e-10)

Loss corresponding to a statistical assumption of Gamma-distributed data X with scale given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Gamma}(k, \sigma_i)$
  • Link function: $m_i = k \sigma_i$
  • Loss function: $f(x,m) = \frac{x}{m + \epsilon} + \log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
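
To see where this loss comes from: the Gamma density with shape $k$ and scale $\sigma_i$ gives $-\log p(x_i \mid \sigma_i) = k \log \sigma_i + \frac{x_i}{\sigma_i} + \mathrm{const}$, and substituting $\sigma_i = m_i / k$ yields $k \left( \log m_i + \frac{x_i}{m_i} \right)$ up to terms that do not depend on $m_i$; dropping the positive factor $k$ and adding the eps offset gives the loss above.
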
GCPDecompositions.GCPLosses.Huber (Type)
Huber(Δ::Real)

Huber loss for a given Δ.

  • Loss function: $f(x, m) = \begin{cases} (x - m)^2 & \text{if } |x - m| \leq \Delta \\ 2\Delta |x - m| - \Delta^2 & \text{otherwise} \end{cases}$
  • Domain: $m \in \mathbb{R}$
source
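
The loss is quadratic inside the threshold Δ and grows only linearly outside it (the two branches meet continuously at |x - m| = Δ), which is what makes it robust to outliers. A quick check, assuming value follows the AbstractLoss interface:

    using GCPDecompositions: GCPLosses

    loss = GCPLosses.Huber(1.0)   # Δ = 1

    GCPLosses.value(loss, 0.0, 0.5)   # |x - m| = 0.5 ≤ Δ: (x - m)^2 = 0.25
    GCPLosses.value(loss, 0.0, 3.0)   # |x - m| = 3 > Δ: 2*1*3 - 1^2 = 5.0
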
GCPDecompositions.GCPLosses.LeastSquares (Type)
LeastSquares()

Loss corresponding to conventional CP decomposition. Corresponds to a statistical assumption of Gaussian data X with mean given by the low-rank model tensor M.

  • Distribution: $x_i \sim \mathcal{N}(\mu_i, \sigma)$
  • Link function: $m_i = \mu_i$
  • Loss function: $f(x,m) = (x-m)^2$
  • Domain: $m \in \mathbb{R}$
source
GCPDecompositions.GCPLosses.NegativeBinomialOdds (Type)
NegativeBinomialOdds(r::Integer, eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Negative Binomial data X with odds of failure given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{NegativeBinomial}(r, \rho_i)$
  • Link function: $m_i = \frac{\rho_i}{1 - \rho_i}$
  • Loss function: $f(x, m) = (r + x) \log(1 + m) - x\log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
GCPDecompositions.GCPLosses.NonnegativeLeastSquares (Type)
NonnegativeLeastSquares()

Loss corresponding to nonnegative CP decomposition. Corresponds to a statistical assumption of Gaussian data X with nonnegative mean given by the low-rank model tensor M.

  • Distribution: $x_i \sim \mathcal{N}(\mu_i, \sigma)$
  • Link function: $m_i = \mu_i$
  • Loss function: $f(x,m) = (x-m)^2$
  • Domain: $m \in [0, \infty)$
source
GCPDecompositions.GCPLosses.Poisson (Type)
Poisson(eps::Real = 1e-10)

Loss corresponding to a statistical assumption of Poisson data X with rate given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Poisson}(\lambda_i)$
  • Link function: $m_i = \lambda_i$
  • Loss function: $f(x,m) = m - x \log(m + \epsilon)$
  • Domain: $m \in [0, \infty)$
source
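
Up to the $\log(x!)$ term, which does not depend on the model, this is the negative Poisson log-likelihood, so model entries close to the observed counts score lower. For example, assuming value follows the AbstractLoss interface:

    using GCPDecompositions: GCPLosses

    loss = GCPLosses.Poisson()

    # For an observed count x = 3, a rate near 3 is cheaper than one far away:
    GCPLosses.value(loss, 3, 3.0)   # 3 - 3*log(3) ≈ -0.296
    GCPLosses.value(loss, 3, 8.0)   # 8 - 3*log(8) ≈  1.762
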
GCPDecompositions.GCPLosses.PoissonLog (Type)
PoissonLog()

Loss corresponding to a statistical assumption of Poisson data X with log-rate given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Poisson}(\lambda_i)$
  • Link function: $m_i = \log \lambda_i$
  • Loss function: $f(x,m) = e^m - x m$
  • Domain: $m \in \mathbb{R}$
source
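
The log link removes the nonnegativity constraint on the model entries: a log-rate entry $m$ here plays the same role as the rate $e^m$ under the Poisson loss above, and the two losses agree up to the eps offset. A quick check, again assuming the value interface:

    using GCPDecompositions: GCPLosses

    x, m = 3, 1.2
    GCPLosses.value(GCPLosses.PoissonLog(), x, m)      # exp(1.2) - 3*1.2 ≈ -0.280
    GCPLosses.value(GCPLosses.Poisson(), x, exp(m))    # same value up to eps
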
GCPDecompositions.GCPLosses.Rayleigh (Type)
Rayleigh(eps::Real = 1e-10)

Loss corresponding to the statistical assumption of Rayleigh data X with scale given by the low-rank model tensor M.

  • Distribution: $x_i \sim \operatorname{Rayleigh}(\theta_i)$
  • Link function: $m_i = \sqrt{\frac{\pi}{2}} \theta_i$
  • Loss function: $f(x, m) = 2\log(m + \epsilon) + \frac{\pi}{4}(\frac{x}{m + \epsilon})^2$
  • Domain: $m \in [0, \infty)$
source
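
To see where this loss comes from: the Rayleigh density with scale $\theta_i$ gives $-\log p(x_i \mid \theta_i) = 2 \log \theta_i + \frac{x_i^2}{2 \theta_i^2} + \mathrm{const}$, and substituting $\theta_i = m_i \sqrt{2/\pi}$ (from the link above) yields $2 \log m_i + \frac{\pi}{4} \left( \frac{x_i}{m_i} \right)^2$ up to terms that do not depend on $m_i$; the eps offset then guards the boundary $m_i = 0$.
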
GCPDecompositions.GCPLosses.UserDefined (Type)
UserDefined

Type for user-defined loss functions $f(x,m)$, where $x$ is the data entry and $m$ is the model entry.

Contains three fields:

  1. func::Function : function that evaluates the loss function $f(x,m)$
  2. deriv::Function : function that evaluates the partial derivative $\partial_m f(x,m)$ with respect to $m$
  3. domain::Interval : Interval from IntervalSets.jl defining the domain for $m$

The constructor is UserDefined(func; deriv, domain). If not provided,

  • deriv is automatically computed from func using forward-mode automatic differentiation
  • domain gets a default value of Interval(-Inf, +Inf)
source
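
For example, a user-defined version of the nonnegative least-squares loss could be constructed as follows (a sketch using the constructor described above; the deriv and domain keywords are optional):

    using IntervalSets
    using GCPDecompositions: GCPLosses

    # (x - m)^2 with its partial derivative in m and a nonnegative domain.
    myloss = GCPLosses.UserDefined(
        (x, m) -> (x - m)^2;
        deriv = (x, m) -> 2 * (m - x),
        domain = Interval(0.0, +Inf),
    )
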
GCPDecompositions.GCPLosses.grad_U! (Function)
grad_U!(GU, M::CPD, X::AbstractArray, loss)

Compute the GCP gradient with respect to the factor matrices U = (U[1],...,U[N]) for the model tensor M, data tensor X, and loss function loss, and store the result in GU = (GU[1],...,GU[N]).

source
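
A usage sketch, assuming M is an existing CPD model whose factor matrices are stored in the tuple M.U and X is a data tensor of matching size (both are placeholders here):

    using GCPDecompositions: GCPLosses

    # Allocate gradient buffers with the same shapes as the factor matrices,
    # then fill them in place with the GCP gradient for the chosen loss.
    GU = similar.(M.U)
    GCPLosses.grad_U!(GU, M, X, GCPLosses.LeastSquares())
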