Title: Kriging Modeling
Description: An 'Eigen'-based computationally efficient 'C++' implementation for fitting various kriging models to data. This research is supported by U.S. National Science Foundation grant DMS-2310637.
Authors: Chaofan Huang [aut, cre], V. Roshan Joseph [aut]
Maintainer: Chaofan Huang <[email protected]>
License: GPL (>= 2)
Version: 1.0.1
Built: 2025-02-15 03:39:11 UTC
Source: https://github.com/cran/rkriging
This function computes the kernel (correlation) matrix. Given a kernel class object and input data of size n, it returns the corresponding $n \times n$ kernel (correlation) matrix.
Evaluate.Kernel(kernel, X)
kernel: a kernel class object
X: input data
The kernel (correlation) matrix of X evaluated by the kernel.
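In symbols, for the input rows $x_1, \dots, x_n$ of X and kernel function $k$, the returned matrix is the symmetric $n \times n$ matrix

$$K_{ij} = k(x_i, x_j), \qquad i, j = 1, \dots, n,$$

with unit diagonal, since the kernels in this package are correlation functions ($k(x, x) = 1$).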
Chaofan Huang and V. Roshan Joseph
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)
kernel <- Gaussian.Kernel(lengthscale)
Evaluate.Kernel(kernel, X)
This function provides a common interface to fit various kriging models from the data. OK (Ordinary.Kriging), UK (Universal.Kriging), LK (Limit.Kriging), RK (Rational.Kriging), and GRK (Generalized.Rational.Kriging) are supported in this package.
Fit.Kriging(
  X,
  y,
  interpolation = TRUE,
  fit = TRUE,
  model = "OK",
  model.parameters = list(),
  kernel = NULL,
  kernel.parameters = list(),
  nlopt.parameters = list()
)
X: a matrix for the input (features)
y: a vector for the output (target); only one-dimensional output is supported
interpolation: whether to interpolate; for noisy data please set interpolation=FALSE
fit: whether to fit the length scale parameters from the data
model: choice of kriging model: OK, UK, LK, RK, GRK
model.parameters: a list of parameters required for the specific kriging model, e.g. basis functions for universal kriging
kernel: a kernel class object
kernel.parameters: a list of parameters required for the kernel, if no kernel class object is provided
nlopt.parameters: a list of parameters required for NLopt, including the choice of optimization algorithm and the maximum number of evaluations
Kriging gives the best linear unbiased predictor given the data. It can also be given a Bayesian interpretation by assuming a Gaussian process (GP) prior for the underlying function. Please see Santner et al. (2003), Rasmussen and Williams (2006), and Joseph (2024) for details.
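In the simplest case of ordinary kriging with constant mean, the predictor takes the standard form (see, e.g., Santner et al., 2003)

$$\hat{y}(x) = \hat{\mu} + r(x)^\top R^{-1}\left(y - \hat{\mu}\mathbf{1}\right), \qquad \hat{\mu} = \frac{\mathbf{1}^\top R^{-1} y}{\mathbf{1}^\top R^{-1} \mathbf{1}},$$

where $R$ is the kernel (correlation) matrix of the training inputs and $r(x)$ is the vector of correlations between the new input $x$ and the training inputs. The other models in this package modify this baseline predictor.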
For data from deterministic computer experiments, use interpolation=TRUE, which gives an interpolator.
For noisy data, use interpolation=FALSE, which gives an approximator of the underlying function.
Currently, approximation is supported for OK (Ordinary.Kriging) and UK (Universal.Kriging).
A kernel choice is required and can be specified either by (i) providing a kernel class object to kernel, or (ii) specifying the kernel type and other parameters in kernel.parameters. Please see the examples section for detailed usage.

When the lengthscale / correlation parameters are known, simply provide them in (i) the kernel class object passed to kernel or (ii) kernel.parameters, and set fit=FALSE. The kriging model will then be fitted with the user-provided parameters.

When the lengthscale / correlation parameters are unknown, they can be estimated via the maximum likelihood method by setting fit=TRUE. The initial values / lower bounds / upper bounds of the lengthscale parameters can be provided in kernel.parameters; otherwise good initial values and ranges are estimated from the data. The optimization is performed via NLopt, an open-source library for nonlinear optimization. All gradient-free optimization methods in NLopt are supported and can be specified in nlopt.parameters. See nloptr::nloptr.print.options() for the list of available derivative-free algorithms (prefixed with NLOPT_GN or NLOPT_LN). The maximum number of optimization steps can also be set in nlopt.parameters. Please see the examples section for detailed usage.
A Kriging Class Object.
Chaofan Huang and V. Roshan Joseph
Joseph, V. R. (2006). Limit kriging. Technometrics, 48(4), 458-466.
Joseph, V. R. (2024). Rational Kriging. Journal of the American Statistical Association.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Santner, T. J., Williams, B. J., & Notz, W. I. (2003). The Design and Analysis of Computer Experiments. New York: Springer.
Predict.Kriging, Get.Kriging.Parameters, Get.Kernel, Ordinary.Kriging, Universal.Kriging, Limit.Kriging, Rational.Kriging, Generalized.Rational.Kriging.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

#############################################################################
################ Minimal Example for Fitting a Kriging Model ################
#############################################################################
# Ordinary Kriging
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# Universal Kriging
basis.function <- function(x) {c(1, x[1], x[1]^2)}
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="UK",
                       model.parameters=list(basis.function=basis.function),
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# Limit Kriging
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="LK",
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# Rational Kriging
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="RK",
                       kernel.parameters=list(type="RQ", alpha=1))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# Generalized Rational Kriging
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="GRK",
                       kernel.parameters=list(type="RQ", alpha=1))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

#############################################################################
################ Fitting a Kriging Model with Kernel Object #################
#############################################################################
kernel <- Gaussian.Kernel(rep(1, p))
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK", kernel=kernel)
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

#############################################################################
############### Creating a Kriging Model with Kernel Object #################
#############################################################################
# set fit = FALSE to create a kriging model with the user provided kernel,
# i.e. no optimization for the length scale parameters
kernel <- Gaussian.Kernel(rep(1e-1, p))
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=FALSE, model="OK", kernel=kernel)
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

#############################################################################
############ Fitting a Kriging Model with Range for Lengthscale #############
#############################################################################
# if not provided, a good estimate would be computed from data
kernel.parameters <- list(
  type = 'Gaussian',
  lengthscale = rep(1, p),                # initial value
  lengthscale.lower.bound = rep(1e-3, p), # lower bound
  lengthscale.upper.bound = rep(1e3, p)   # upper bound
)
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=kernel.parameters)
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

#############################################################################
########### Fitting a Kriging Model with Different Optimization #############
#############################################################################
nlopt.parameters <- list(
  algorithm = 'NLOPT_GN_DIRECT', # optimization method
  maxeval = 250                  # maximum number of evaluations
)
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"),
                       nlopt.parameters=nlopt.parameters)
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# Global-Local Optimization
nlopt.parameters <- list(
  algorithm = 'NLOPT_GN_MLSL_LDS',    # optimization method
  local.algorithm = 'NLOPT_LN_SBPLX', # local algorithm
  maxeval = 250                       # maximum number of evaluations
)
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"),
                       nlopt.parameters=nlopt.parameters)
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

#############################################################################
################# Fitting a Kriging Model from Noisy Data ###################
#############################################################################
y <- y + 0.1 * rnorm(length(y))
kriging <- Fit.Kriging(X, y, interpolation=FALSE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)
This function specifies the Gaussian / Squared Exponential (SE) / Radial Basis Function (RBF) kernel.
Gaussian.Kernel(lengthscale)
lengthscale: a vector for the positive length scale parameters
The Gaussian kernel is given by

$$k(r) = \exp(-r^2/2),$$

where

$$r(x, x') = \sqrt{\sum_{i=1}^{p} \left(\frac{x_i - x_i'}{l_i}\right)^2}$$

is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
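As an illustration of this convention, the correlation matrix can be computed directly in R; gaussian_corr below is a hypothetical helper sketching the formula above (assuming the $\exp(-r^2/2)$ parameterization), not the package's internal 'C++' code.

# sketch: Gaussian correlation matrix under k(r) = exp(-r^2/2)
gaussian_corr <- function(X, lengthscale) {
  Xs <- sweep(X, 2, lengthscale, "/")  # weight each dimension by its length scale
  D2 <- as.matrix(dist(Xs))^2          # squared weighted Euclidean distances r^2
  exp(-D2/2)
}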
A Gaussian Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- Gaussian.Kernel(lengthscale)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="Gaussian")
Evaluate.Kernel(kernel, X)
This function fits the generalized rational kriging model to the data.
Generalized.Rational.Kriging(
  X,
  y,
  fit = TRUE,
  kernel = NULL,
  kernel.parameters = list(),
  nlopt.parameters = list()
)
X: a matrix for the input (features)
y: a vector for the output (target); only one-dimensional output is supported
fit: whether to fit the length scale parameters from the data
kernel: a kernel class object
kernel.parameters: a list of parameters required for the kernel, if no kernel class object is provided
nlopt.parameters: a list of parameters required for NLopt, including the choice of optimization algorithm and the maximum number of evaluations
Ordinary kriging and rational kriging can be obtained as special cases of generalized rational kriging. Please see Joseph (2024) for details. The Spectra library is used for fast computation of the leading eigenvalues/eigenvectors. Only interpolation is available: noisy output is not supported for generalized rational kriging.
A kernel choice is required and can be specified either by (i) providing a kernel class object to kernel, or (ii) specifying the kernel type and other parameters in kernel.parameters. Please see the examples section of Fit.Kriging for detailed usage.

When the lengthscale / correlation parameters are unknown, all parameters including the constant mean can be estimated via the maximum likelihood method by setting fit=TRUE. The initial values / lower bounds / upper bounds of the lengthscale parameters can be provided in kernel.parameters; otherwise good initial values and ranges are estimated from the data. The optimization is performed via NLopt, an open-source library for nonlinear optimization. All gradient-free optimization methods in NLopt are supported and can be specified in nlopt.parameters. See nloptr::nloptr.print.options() for the list of available derivative-free algorithms (prefixed with NLOPT_GN or NLOPT_LN). The maximum number of optimization steps can also be set in nlopt.parameters. Please see the examples section of Fit.Kriging for detailed usage.
A Generalized Rational Kriging Class Object.
Chaofan Huang and V. Roshan Joseph
Joseph, V. R. (2024). Rational Kriging. Journal of the American Statistical Association.
Qiu, Y., Guennebaud, G., & Niesen, J. (2015). Spectra: C++ library for large scale eigenvalue problems.
Fit.Kriging, Predict.Kriging, Get.Kriging.Parameters.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

# approach 1
kriging <- Generalized.Rational.Kriging(X, y, fit=TRUE,
                                        kernel.parameters=list(type="RQ", alpha=1))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# approach 2
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="GRK",
                       kernel.parameters=list(type="RQ", alpha=1))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)
This function provides a common interface to specify various kernels. See arguments section for the available kernels in this package.
Get.Kernel(lengthscale, type, parameters = list())
lengthscale: a vector for the positive length scale parameters
type: kernel type: Gaussian, RQ, Matern12, Matern32, Matern52, Matern, UDF, MultiplicativeRQ, MultiplicativeMatern, MultiplicativeUDF
parameters: a list of parameters required for the specific kernel
A Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Evaluate.Kernel, Gaussian.Kernel, RQ.Kernel, Matern12.Kernel, Matern32.Kernel, Matern52.Kernel, Matern.Kernel, UDF.Kernel, MultiplicativeRQ.Kernel, MultiplicativeMatern.Kernel, MultiplicativeUDF.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# Gaussian
kernel <- Get.Kernel(lengthscale, type="Gaussian")
Evaluate.Kernel(kernel, X)

# Rational Quadratic (RQ)
kernel <- Get.Kernel(lengthscale, type="RQ", parameters=list(alpha=1))
Evaluate.Kernel(kernel, X)

# Matern(1/2)
kernel <- Get.Kernel(lengthscale, type="Matern12")
Evaluate.Kernel(kernel, X)

# Matern(3/2)
kernel <- Get.Kernel(lengthscale, type="Matern32")
Evaluate.Kernel(kernel, X)

# Matern(5/2)
kernel <- Get.Kernel(lengthscale, type="Matern52")
Evaluate.Kernel(kernel, X)

# Generalized Matern
kernel <- Get.Kernel(lengthscale, type="Matern", parameters=list(nu=2.01))
Evaluate.Kernel(kernel, X)

# User Defined Function (UDF) Kernel
kernel.function <- function(sqdist) {return (exp(-sqrt(sqdist)))}
kernel <- Get.Kernel(lengthscale, type="UDF",
                     parameters=list(kernel.function=kernel.function))
Evaluate.Kernel(kernel, X)

# Multiplicative Rational Quadratic (RQ)
kernel <- Get.Kernel(lengthscale, type="MultiplicativeRQ", parameters=list(alpha=1))
Evaluate.Kernel(kernel, X)

# Multiplicative Generalized Matern
kernel <- Get.Kernel(lengthscale, type="MultiplicativeMatern", parameters=list(nu=2.01))
Evaluate.Kernel(kernel, X)

# Multiplicative User Defined Function (UDF)
kernel.function <- function(sqdist) {return (exp(-sqrt(sqdist)))}
kernel <- Get.Kernel(lengthscale, type="MultiplicativeUDF",
                     parameters=list(kernel.function=kernel.function))
Evaluate.Kernel(kernel, X)
This function can be used for extracting the estimates of the kriging parameters.
Get.Kriging.Parameters(kriging)
kriging: a kriging class object

nllh: negative log-likelihood of the kriging model
mu: mean of the kriging model
nu2: variance of the kriging model
sigma2: variance of the random noise when interpolation=FALSE
beta: coefficients of the basis functions for universal kriging
c: c for the rational / generalized rational kriging, see Joseph (2024)
c0: c0 for the generalized rational kriging, see Joseph (2024)
Chaofan Huang and V. Roshan Joseph
Joseph, V. R. (2006). Limit kriging. Technometrics, 48(4), 458-466.
Joseph, V. R. (2024). Rational Kriging. Journal of the American Statistical Association.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Santner, T. J., Williams, B. J., & Notz, W. I. (2003). The Design and Analysis of Computer Experiments. New York: Springer.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"))
Get.Kriging.Parameters(kriging)
This function fits the limit kriging model to the data.
Limit.Kriging(
  X,
  y,
  fit = TRUE,
  kernel = NULL,
  kernel.parameters = list(),
  nlopt.parameters = list()
)
X: a matrix for the input (features)
y: a vector for the output (target); only one-dimensional output is supported
fit: whether to fit the length scale parameters from the data
kernel: a kernel class object
kernel.parameters: a list of parameters required for the kernel, if no kernel class object is provided
nlopt.parameters: a list of parameters required for NLopt, including the choice of optimization algorithm and the maximum number of evaluations
Limit kriging avoids the mean reversion issue of ordinary kriging. Please see Joseph (2006) for details. Only interpolation is available. Noisy output is not supported for limit kriging.
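Concretely, whereas the ordinary kriging predictor reverts to the estimated constant mean away from the data, the limit kriging predictor of Joseph (2006) takes the ratio form

$$\hat{y}(x) = \frac{r(x)^\top R^{-1} y}{r(x)^\top R^{-1} \mathbf{1}},$$

where $R$ is the correlation matrix of the training inputs and $r(x)$ the vector of correlations between $x$ and the training inputs, so predictions far from the design points are driven by nearby observations rather than reverting to the mean.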
A kernel choice is required and can be specified either by (i) providing a kernel class object to kernel, or (ii) specifying the kernel type and other parameters in kernel.parameters. Please see the examples section of Fit.Kriging for detailed usage.

When the lengthscale / correlation parameters are unknown, all parameters including the constant mean can be estimated via the maximum likelihood method by setting fit=TRUE. The initial values / lower bounds / upper bounds of the lengthscale parameters can be provided in kernel.parameters; otherwise good initial values and ranges are estimated from the data. The optimization is performed via NLopt, an open-source library for nonlinear optimization. All gradient-free optimization methods in NLopt are supported and can be specified in nlopt.parameters. See nloptr::nloptr.print.options() for the list of available derivative-free algorithms (prefixed with NLOPT_GN or NLOPT_LN). The maximum number of optimization steps can also be set in nlopt.parameters. Please see the examples section of Fit.Kriging for detailed usage.
A Limit Kriging Class Object.
Chaofan Huang and V. Roshan Joseph
Joseph, V. R. (2006). Limit kriging. Technometrics, 48(4), 458-466.
Fit.Kriging, Predict.Kriging, Get.Kriging.Parameters.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

# approach 1
kriging <- Limit.Kriging(X, y, fit=TRUE, kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# approach 2
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="LK",
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)
This function specifies the (Generalized) Matern kernel with any positive smoothness parameter $\nu$.
Matern.Kernel(lengthscale, nu = 2.01)
lengthscale: a vector for the positive length scale parameters
nu: a positive scalar parameter that controls the smoothness
The Generalized Matern kernel is given by

$$k(r) = \frac{2^{1-\nu}}{\Gamma(\nu)} \left(\sqrt{2\nu}\, r\right)^{\nu} K_{\nu}\!\left(\sqrt{2\nu}\, r\right),$$

where $\nu$ is the smoothness parameter, $K_{\nu}$ is the modified Bessel function of the second kind, $\Gamma$ is the gamma function, and

$$r(x, x') = \sqrt{\sum_{i=1}^{p} \left(\frac{x_i - x_i'}{l_i}\right)^2}$$

is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s. As $\nu \to \infty$, it converges to the Gaussian.Kernel.
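For reference, the formula above can be evaluated directly in base R with besselK; this is an illustrative sketch of the stated formula, not the package's 'C++' implementation.

# sketch: generalized Matern correlation as a function of the weighted distance r
matern_corr <- function(r, nu) {
  k <- 2^(1-nu)/gamma(nu) * (sqrt(2*nu)*r)^nu * besselK(sqrt(2*nu)*r, nu)
  k[r == 0] <- 1  # k(0) = 1 in the limit r -> 0
  k
}
matern_corr(c(0, 0.5, 1, 2), nu=2.01)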
A Generalized Matern Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Matern12.Kernel, Matern32.Kernel, Matern52.Kernel, MultiplicativeMatern.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- Matern.Kernel(lengthscale, nu=2.01)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="Matern", parameters=list(nu=2.01))
Evaluate.Kernel(kernel, X)
This function specifies the Matern kernel with smoothness parameter $\nu = 1/2$. It is also known as the Exponential kernel.
Matern12.Kernel(lengthscale)
lengthscale: a vector for the positive length scale parameters
The Matern(1/2) kernel is given by

$$k(r) = \exp(-r),$$

where $r(x, x') = \sqrt{\sum_{i=1}^{p} \left((x_i - x_i')/l_i\right)^2}$ is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
A Matern(1/2) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Matern.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- Matern12.Kernel(lengthscale)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="Matern12")
Evaluate.Kernel(kernel, X)
This function specifies the Matern kernel with smoothness parameter $\nu = 3/2$.
Matern32.Kernel(lengthscale)
lengthscale: a vector for the positive length scale parameters
The Matern(3/2) kernel is given by

$$k(r) = \left(1 + \sqrt{3}\, r\right)\exp\!\left(-\sqrt{3}\, r\right),$$

where $r(x, x') = \sqrt{\sum_{i=1}^{p} \left((x_i - x_i')/l_i\right)^2}$ is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
A Matern(3/2) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Matern.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- Matern32.Kernel(lengthscale)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="Matern32")
Evaluate.Kernel(kernel, X)
This function specifies the Matern kernel with smoothness parameter $\nu = 5/2$.
Matern52.Kernel(lengthscale)
lengthscale: a vector for the positive length scale parameters
The Matern(5/2) kernel is given by

$$k(r) = \left(1 + \sqrt{5}\, r + \frac{5r^2}{3}\right)\exp\!\left(-\sqrt{5}\, r\right),$$

where $r(x, x') = \sqrt{\sum_{i=1}^{p} \left((x_i - x_i')/l_i\right)^2}$ is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
A Matern(5/2) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Matern.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- Matern52.Kernel(lengthscale)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="Matern52")
Evaluate.Kernel(kernel, X)
This function specifies the Multiplicative Generalized Matern kernel.
MultiplicativeMatern.Kernel(lengthscale, nu = 2.01)
lengthscale: a vector for the positive length scale parameters
nu: a positive scalar parameter that controls the smoothness
The Multiplicative Generalized Matern kernel is given by

$$k(x, x') = \prod_{i=1}^{p} \frac{2^{1-\nu}}{\Gamma(\nu)} \left(\sqrt{2\nu}\, r_i\right)^{\nu} K_{\nu}\!\left(\sqrt{2\nu}\, r_i\right),$$

where $\nu$ is the smoothness parameter, $K_{\nu}$ is the modified Bessel function of the second kind, $\Gamma$ is the gamma function, and $r_i(x, x') = |x_i - x_i'|/l_i$ are the dimension-wise Euclidean distances between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
A Multiplicative Generalized Matern Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Matern.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- MultiplicativeMatern.Kernel(lengthscale, nu=2.01)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="MultiplicativeMatern", parameters=list(nu=2.01))
Evaluate.Kernel(kernel, X)
This function specifies the Multiplicative Rational Quadratic (RQ) kernel.
MultiplicativeRQ.Kernel(lengthscale, alpha = 1)
lengthscale: a vector for the positive length scale parameters
alpha: a positive scalar for the scale mixture parameter that controls the relative weighting of large-scale and small-scale variations
The Multiplicative Rational Quadratic (RQ) kernel is given by

$$k(x, x') = \prod_{i=1}^{p} \left(1 + \frac{r_i^2}{2\alpha}\right)^{-\alpha},$$

where $\alpha$ is the scale mixture parameter and $r_i(x, x') = |x_i - x_i'|/l_i$ are the dimension-wise Euclidean distances between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
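The product structure can be made explicit in a few lines of R; multiplicative_rq below is a hypothetical helper sketching the formula above, not the package's implementation.

# sketch: multiplicative RQ correlation matrix, one factor per input dimension
multiplicative_rq <- function(X, lengthscale, alpha=1) {
  K <- matrix(1, nrow(X), nrow(X))
  for (i in seq_len(ncol(X))) {
    ri2 <- (outer(X[,i], X[,i], "-")/lengthscale[i])^2  # dimension-wise squared distances
    K <- K * (1 + ri2/(2*alpha))^(-alpha)
  }
  K
}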
A Multiplicative Rational Quadratic (RQ) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
RQ.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- MultiplicativeRQ.Kernel(lengthscale, alpha=1)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="MultiplicativeRQ", parameters=list(alpha=1))
Evaluate.Kernel(kernel, X)
This function specifies a multiplicative kernel built from a user defined R function.
MultiplicativeUDF.Kernel(lengthscale, kernel.function)
lengthscale: a vector for the positive length scale parameters
kernel.function: user defined kernel function
The Multiplicative User Defined Function (UDF) kernel is given by

$$k(x, x') = \prod_{i=1}^{p} f(r_i^2),$$

where $f$ is the user defined kernel function that takes the squared distance $r_i^2$ as input, and $r_i(x, x') = |x_i - x_i'|/l_i$ are the dimension-wise Euclidean distances between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
A Multiplicative User Defined Function (UDF) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
UDF.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)
kernel.function <- function(sqdist) {return (exp(-sqrt(sqdist)))}

# approach 1
kernel <- MultiplicativeUDF.Kernel(lengthscale, kernel.function=kernel.function)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="MultiplicativeUDF",
                     parameters=list(kernel.function=kernel.function))
Evaluate.Kernel(kernel, X)
This function fits the ordinary kriging model to the data.
Ordinary.Kriging(
  X,
  y,
  interpolation = TRUE,
  fit = TRUE,
  kernel = NULL,
  kernel.parameters = list(),
  nlopt.parameters = list()
)
X: a matrix for the input (features)
y: a vector for the output (target); only one-dimensional output is supported
interpolation: whether to interpolate; for noisy data please set interpolation=FALSE
fit: whether to fit the length scale parameters from the data
kernel: a kernel class object
kernel.parameters: a list of parameters required for the kernel, if no kernel class object is provided
nlopt.parameters: a list of parameters required for NLopt, including the choice of optimization algorithm and the maximum number of evaluations
Ordinary kriging assumes a constant mean. Please see Santner et al. (2003) for details.
For data from deterministic computer experiments, use interpolation=TRUE, which gives an interpolator.
For noisy data, use interpolation=FALSE, which gives an approximator of the underlying function.
A kernel choice is required and can be specified either by (i) providing a kernel class object to kernel, or (ii) specifying the kernel type and other parameters in kernel.parameters. Please see the examples section of Fit.Kriging for detailed usage.

When the lengthscale / correlation parameters are unknown, all parameters including the constant mean can be estimated via the maximum likelihood method by setting fit=TRUE. The initial values / lower bounds / upper bounds of the lengthscale parameters can be provided in kernel.parameters; otherwise good initial values and ranges are estimated from the data. The optimization is performed via NLopt, an open-source library for nonlinear optimization. All gradient-free optimization methods in NLopt are supported and can be specified in nlopt.parameters. See nloptr::nloptr.print.options() for the list of available derivative-free algorithms (prefixed with NLOPT_GN or NLOPT_LN). The maximum number of optimization steps can also be set in nlopt.parameters. Please see the examples section of Fit.Kriging for detailed usage.
An Ordinary Kriging Class Object.
Chaofan Huang and V. Roshan Joseph
Santner, T. J., Williams, B. J., & Notz, W. I. (2003). The Design and Analysis of Computer Experiments. New York: Springer.
Fit.Kriging, Predict.Kriging, Get.Kriging.Parameters.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

# approach 1
kriging <- Ordinary.Kriging(X, y, interpolation=TRUE, fit=TRUE,
                            kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# approach 2
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)
This function gives prediction and uncertainty quantification of the kriging model on a new input.
Predict.Kriging(kriging, X)
kriging: a kriging class object
X: a matrix for the new input (features) to perform predictions

mean: kriging mean computed at the new input
sd: kriging standard deviation computed at the new input
Chaofan Huang and V. Roshan Joseph
Joseph, V. R. (2006). Limit kriging. Technometrics, 48(4), 458-466.
Joseph, V. R. (2024). Rational Kriging. Journal of the American Statistical Association.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
Santner, T. J., Williams, B. J., & Notz, W. I. (2003). The Design and Analysis of Computer Experiments. New York: Springer.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="OK",
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
This function fits the rational kriging model to the data.
Rational.Kriging(
  X,
  y,
  fit = TRUE,
  kernel = NULL,
  kernel.parameters = list(),
  nlopt.parameters = list()
)
X: a matrix for the input (features)
y: a vector for the output (target); only one-dimensional output is supported
fit: whether to fit the length scale parameters from the data
kernel: a kernel class object
kernel.parameters: a list of parameters required for the kernel, if no kernel class object is provided
nlopt.parameters: a list of parameters required for NLopt, including the choice of optimization algorithm and the maximum number of evaluations
Rational kriging gives a rational predictor in terms of the inputs, which gives a more stable estimate of the mean. Please see Joseph (2024) for details. Only interpolation is available. Noisy output is not supported for rational kriging.
A kernel choice is required and can be specified either by (i) providing a kernel class object to kernel, or (ii) specifying the kernel type and other parameters in kernel.parameters. Please see the examples section of Fit.Kriging for detailed usage.

When the lengthscale / correlation parameters are unknown, all parameters including the constant mean can be estimated via the maximum likelihood method by setting fit=TRUE. The initial values / lower bounds / upper bounds of the lengthscale parameters can be provided in kernel.parameters; otherwise good initial values and ranges are estimated from the data. The optimization is performed via NLopt, an open-source library for nonlinear optimization. All gradient-free optimization methods in NLopt are supported and can be specified in nlopt.parameters. See nloptr::nloptr.print.options() for the list of available derivative-free algorithms (prefixed with NLOPT_GN or NLOPT_LN). The maximum number of optimization steps can also be set in nlopt.parameters. Please see the examples section of Fit.Kriging for detailed usage.
A Rational Kriging Class Object.
Chaofan Huang and V. Roshan Joseph
Joseph, V. R. (2024). Rational Kriging. Journal of the American Statistical Association.
Fit.Kriging, Predict.Kriging, Get.Kriging.Parameters.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

# approach 1
kriging <- Rational.Kriging(X, y, fit=TRUE, kernel.parameters=list(type="RQ", alpha=1))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# approach 2
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="RK",
                       kernel.parameters=list(type="RQ", alpha=1))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)
This function specifies the Rational Quadratic (RQ) kernel.
RQ.Kernel(lengthscale, alpha = 1)
lengthscale: a vector for the positive length scale parameters
alpha: a positive scalar for the scale mixture parameter that controls the relative weighting of large-scale and small-scale variations
The Rational Quadratic (RQ) kernel is given by

$$k(r) = \left(1 + \frac{r^2}{2\alpha}\right)^{-\alpha},$$

where $\alpha$ is the scale mixture parameter and $r(x, x') = \sqrt{\sum_{i=1}^{p} \left((x_i - x_i')/l_i\right)^2}$ is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s. As $\alpha \to \infty$, it converges to the Gaussian.Kernel.
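The convergence can be checked numerically: for a fixed weighted distance $r$, $(1 + r^2/(2\alpha))^{-\alpha}$ approaches the Gaussian value $\exp(-r^2/2)$ as $\alpha$ grows. A quick sketch under the parameterization above:

# numeric check of the alpha -> infinity limit at a fixed weighted distance
r <- 1.5
sapply(c(1, 10, 100, 1e4), function(alpha) (1 + r^2/(2*alpha))^(-alpha))
exp(-r^2/2)  # limiting Gaussian value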
A Rational Quadratic (RQ) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
MultiplicativeRQ.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)

# approach 1
kernel <- RQ.Kernel(lengthscale, alpha=1)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="RQ", parameters=list(alpha=1))
Evaluate.Kernel(kernel, X)
This function specifies a kernel built from a user defined R function.
UDF.Kernel(lengthscale, kernel.function)
lengthscale: a vector for the positive length scale parameters
kernel.function: user defined kernel function
The User Defined Function (UDF) kernel is given by

$$k(r) = f(r^2),$$

where $f$ is the user defined kernel function that takes the squared distance $r^2$ as input, and $r(x, x') = \sqrt{\sum_{i=1}^{p} \left((x_i - x_i')/l_i\right)^2}$ is the Euclidean distance between $x$ and $x'$ weighted by the length scale parameters $l_i$'s.
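For example, assuming the squared-distance convention and the $\exp(-r^2/2)$ Gaussian parameterization stated above, a Gaussian-type kernel could be reproduced as a UDF kernel (a hypothetical sketch):

# hypothetical UDF reproducing the Gaussian form k(r) = exp(-r^2/2)
kernel.function <- function(sqdist) {return (exp(-sqdist/2))}
kernel <- UDF.Kernel(rep(1, 3), kernel.function=kernel.function)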
A User Defined Function (UDF) Kernel Class Object.
Chaofan Huang and V. Roshan Joseph
Duvenaud, D. (2014). The kernel cookbook: Advice on covariance functions.
Rasmussen, C. E. & Williams, C. K. (2006). Gaussian Processes for Machine Learning. The MIT Press.
MultiplicativeUDF.Kernel, Get.Kernel, Evaluate.Kernel.
n <- 5
p <- 3
X <- matrix(rnorm(n*p), ncol=p)
lengthscale <- c(1:p)
kernel.function <- function(sqdist) {return (exp(-sqrt(sqdist)))}

# approach 1
kernel <- UDF.Kernel(lengthscale, kernel.function=kernel.function)
Evaluate.Kernel(kernel, X)

# approach 2
kernel <- Get.Kernel(lengthscale, type="UDF",
                     parameters=list(kernel.function=kernel.function))
Evaluate.Kernel(kernel, X)
This function fits the universal kriging model to the data.
Universal.Kriging(
  X,
  y,
  basis.function,
  interpolation = TRUE,
  fit = TRUE,
  kernel = NULL,
  kernel.parameters = list(),
  nlopt.parameters = list()
)
X: a matrix for the input (features)
y: a vector for the output (target); only one-dimensional output is supported
basis.function: the basis functions for specifying the prior mean
interpolation: whether to interpolate; for noisy data please set interpolation=FALSE
fit: whether to fit the length scale parameters from the data
kernel: a kernel class object
kernel.parameters: a list of parameters required for the kernel, if no kernel class object is provided
nlopt.parameters: a list of parameters required for NLopt, including the choice of optimization algorithm and the maximum number of evaluations
Universal kriging permits a more general mean function, which can be specified using basis.function. Please see Santner et al. (2003) for details.
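As in the quadratic one-dimensional example below, basis.function takes a single input point x and returns the vector of basis evaluations. For instance, a hypothetical basis for a two-dimensional input with an intercept, linear main effects, and an interaction could be specified as:

# hypothetical basis for p = 2 inputs: intercept, main effects, interaction
basis.function <- function(x) {c(1, x[1], x[2], x[1]*x[2])}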
For data from deterministic computer experiments, use interpolation=TRUE, which gives an interpolator.
For noisy data, use interpolation=FALSE, which gives an approximator of the underlying function.
A kernel choice is required and can be specified either by (i) providing a kernel class object to kernel, or (ii) specifying the kernel type and other parameters in kernel.parameters. Please see the examples section of Fit.Kriging for detailed usage.

When the lengthscale / correlation parameters are unknown, all parameters including the constant mean can be estimated via the maximum likelihood method by setting fit=TRUE. The initial values / lower bounds / upper bounds of the lengthscale parameters can be provided in kernel.parameters; otherwise good initial values and ranges are estimated from the data. The optimization is performed via NLopt, an open-source library for nonlinear optimization. All gradient-free optimization methods in NLopt are supported and can be specified in nlopt.parameters. See nloptr::nloptr.print.options() for the list of available derivative-free algorithms (prefixed with NLOPT_GN or NLOPT_LN). The maximum number of optimization steps can also be set in nlopt.parameters. Please see the examples section of Fit.Kriging for detailed usage.
A Universal Kriging Class Object.
Chaofan Huang and V. Roshan Joseph
Santner, T. J., Williams, B. J., & Notz, W. I. (2003). The Design and Analysis of Computer Experiments. New York: Springer.
Fit.Kriging, Predict.Kriging, Get.Kriging.Parameters.
# one dimensional example
f <- function(x) {
  x <- 0.5 + 2*x
  y <- sin(10*pi*x)/(2*x) + (x-1)^4
  return (y)
}

set.seed(1234)

# train set
n <- 30
p <- 1
X <- matrix(runif(n), ncol=p)
y <- apply(X, 1, f)
newX <- matrix(seq(0, 1, length=1001), ncol=p)

basis.function <- function(x) {c(1, x[1], x[1]^2)}

# approach 1
kriging <- Universal.Kriging(X, y, basis.function=basis.function,
                             interpolation=TRUE, fit=TRUE,
                             kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)

# approach 2
kriging <- Fit.Kriging(X, y, interpolation=TRUE, fit=TRUE, model="UK",
                       model.parameters=list(basis.function=basis.function),
                       kernel.parameters=list(type="Gaussian"))
pred <- Predict.Kriging(kriging, newX)
plot(newX, f(newX), "l")
points(X, y, pch=16, col="blue")
lines(newX, pred$mean, col="red", lty=2)
lines(newX, pred$mean-2*pred$sd, col="red", lty=3)
lines(newX, pred$mean+2*pred$sd, col="red", lty=3)
Get.Kriging.Parameters(kriging)