# Curve Fits

Sphinx allows you to fit custom curves to your data to help you analyze and understand it. These curve fits are useful if you want to determine fit parameters and have them available in a data table. Multiple types of curve fits are available, including:

- Dose-response
- Standard
- Kinetic
- Simple Linear Regression
- Weighted Linear Regression

If you want an additional curve fit not listed here, please contact support.

# Dose-response

A dose-response curve is used to show the relationship between the dose or concentration of a substance and the response. It helps determine the potency of a drug or the toxicity of a substance. Multiple types of dose-response curves are available, as described in Curve Fitting.

To add a dose-response curve, select the “Dose-response” option from the curve fit dropdown menu in the “Analyze” menu. From there, you can define the parameters for the curve fit.

If you add a group, a separate curve will be fit for each group. To read more about the dose-response curve, see Curve Fitting.
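Sphinx performs this fitting for you, but it can help to see what a dose-response fit involves. Below is a minimal sketch of a four-parameter logistic (4PL) fit using `scipy`; the model, parameter names, and data are illustrative assumptions, not Sphinx's internal implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic (4PL) dose-response model."""
    return bottom + (top - bottom) / (1 + (ec50 / x) ** hill)

# Hypothetical dose-response data (dose in nM, response in arbitrary units)
dose = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300])
response = np.array([2.1, 4.8, 12.5, 31.0, 57.9, 82.3, 94.1, 98.7])

# Fit the 4PL model; p0 gives rough starting guesses for the optimizer
params, _ = curve_fit(four_pl, dose, response, p0=[0, 100, 10, 1])
bottom, top, ec50, hill = params
print(f"EC50 = {ec50:.2f} nM, Hill slope = {hill:.2f}")
```

The EC50 (the dose producing a half-maximal response) and the Hill slope are the fit parameters you would typically read off in a data table.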

# Standard

A standard curve is a graphical representation of the relationship between known concentrations of a substance and their measured values. It is used to quantify unknown concentrations in samples.

To add a standard curve, select the “Standard” option from the curve fit dropdown menu in the “Analyze” menu. From there, you can define the parameters for the curve fit.

If you add a group, a separate curve will be fit for each group.
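To illustrate how a standard curve is used to quantify unknowns, here is a minimal sketch of a linear standard curve fit and back-calculation; the concentrations, readings, and linear model are hypothetical examples, not Sphinx output.

```python
import numpy as np

# Hypothetical standard series: known concentrations vs. measured absorbance
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])          # ug/mL
absorbance = np.array([0.02, 0.11, 0.20, 0.39, 0.78, 1.55])

# Fit a linear standard curve: absorbance = slope * conc + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)

# Back-calculate an unknown sample's concentration from its reading
unknown_abs = 0.50
unknown_conc = (unknown_abs - intercept) / slope
print(f"Estimated concentration: {unknown_conc:.2f} ug/mL")
```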

# Kinetic

A kinetic curve describes the change in a system over time. It is commonly used in chemical reactions to understand reaction rates and mechanisms.

To add a kinetic curve, select the “Kinetic” option from the curve fit dropdown menu in the “Analyze” menu. From there, you can define the parameters for the curve fit.

If you add a group, a separate curve will be fit for each group.
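As an illustration of a kinetic fit, the sketch below fits a first-order decay model to a hypothetical time course using `scipy`; the model choice and data are assumptions for demonstration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(t, a0, k):
    """First-order kinetic model: concentration decays as A0 * exp(-k * t)."""
    return a0 * np.exp(-k * t)

# Hypothetical time-course data (time in minutes, concentration in mM)
t = np.array([0, 2, 4, 8, 16, 32])
conc = np.array([10.1, 8.0, 6.6, 4.5, 2.0, 0.4])

# Fit the decay model; p0 gives rough starting guesses
params, _ = curve_fit(first_order_decay, t, conc, p0=[10, 0.1])
a0, k = params
half_life = np.log(2) / k
print(f"k = {k:.3f} 1/min, half-life = {half_life:.1f} min")
```

The rate constant `k` (and derived quantities like the half-life) are the kind of fit parameters you would inspect afterward.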

# Linear Regression

**(Simple) Linear Regression:** This is used when you assume all data points are equally reliable and have the same level of uncertainty.
It is the standard approach for regression when no additional information about the varying quality of data points is available.

**Weighted Linear Regression:** This is appropriate when you have prior knowledge that some data points should have more influence than others.
For example, in heteroscedasticity (where the variance of errors differs across data points), you might assign lower weights to data points with higher variance to reduce their influence on the model.
Another example is in meta-analysis, where studies with larger sample sizes (and therefore more reliable results) are given more weight.

## Simple Linear Regression

This is also known as ordinary least squares (OLS) regression. The goal is to find a linear relationship (e.g., $y = \beta_0 + \beta_1 x$) that minimizes the sum of the squared differences (residuals) between the observed values and the predicted values. Mathematically, it minimizes the following objective function:

$$\text{Objective: } \min_{\beta_0, \beta_1} \sum_{i=1}^{n} (y_i - (\beta_0 + \beta_1 x_i))^2$$

Here, each residual $(y_i - (\beta_0 + \beta_1 x_i))$ is squared, and these squared residuals are summed across all data points.
There is no differentiation in importance between the data points; every residual contributes **equally** to the total sum of squared errors (SSE).

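The objective above has a closed-form solution. The following sketch computes it directly with `numpy` on hypothetical data, to show that the slope and intercept come straight from the minimization of the unweighted sum of squared residuals.

```python
import numpy as np

# Hypothetical data points
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS estimates minimizing sum_i (y_i - (b0 + b1 * x_i))^2
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Sum of squared errors (SSE) at the fitted parameters
sse = np.sum((y - (b0 + b1 * x)) ** 2)
print(f"y = {b0:.2f} + {b1:.2f} x, SSE = {sse:.3f}")
```

Note that every residual enters the SSE with the same weight, which is exactly what distinguishes this fit from the weighted version below.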
To add a simple linear regression curve, select the “Simple Linear Regression” option from the curve fit dropdown menu in the “Analyze” menu. From there, you can define the parameters for the curve fit.

If you add a group, a separate curve will be fit for each group.

## Weighted Linear Regression

Weighted linear regression is similar to simple linear regression but assigns different weights to data points based on their importance or error. In weighted linear regression, each data point is assigned a weight that determines its influence on the model. The goal is still to minimize the sum of the squared residuals, but now each residual is multiplied by its corresponding weight. The objective function becomes:

$$\text{Objective: } \min_{\beta_0, \beta_1} \sum_{i=1}^{n} w_i \cdot (y_i - (\beta_0 + \beta_1 x_i))^2$$

Here, $w_i$ is the weight associated with the $i$-th data point. A higher weight $w_i$ means that the corresponding data point will have a larger influence on the model, while a lower weight reduces its influence. The application of weights can be useful in scenarios where some observations are deemed more reliable or important than others, or when variances across observations are not constant (heteroscedasticity).

You can use this curve fit to account for different levels of error in your data.

To add a weighted linear regression curve, select the “Weighted Linear Regression” option from the curve fit dropdown menu in the “Analyze” menu. From there, you can define the parameters for the curve fit.

In this menu you can also define the weights for each data point or allow Sphinx to automatically calculate the weights based on the data (using the inverse of the variance of the data). If you add a group, a separate curve will be fit for each group.
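To show how inverse-variance weights enter the weighted objective, here is a minimal closed-form sketch; the data and per-point variances are hypothetical, and this is an illustration of the weighted least-squares formulas rather than Sphinx's internal calculation.

```python
import numpy as np

# Hypothetical data with per-point variances; noisier points get lower weight
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.8, 8.3, 9.6])
var = np.array([0.1, 0.1, 0.4, 0.4, 1.0])
w = 1.0 / var  # inverse-variance weights

# Weighted least squares: minimize sum_i w_i * (y_i - (b0 + b1 * x_i))^2
sw = w.sum()
xw = np.sum(w * x) / sw          # weighted mean of x
yw = np.sum(w * y) / sw          # weighted mean of y
b1 = np.sum(w * (x - xw) * (y - yw)) / np.sum(w * (x - xw) ** 2)
b0 = yw - b1 * xw
print(f"y = {b0:.2f} + {b1:.2f} x")
```

With equal weights ($w_i = 1$ for all $i$) this reduces to the simple linear regression above; unequal weights pull the fitted line toward the more reliable points.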