Type: Package
Title: Make Symmetric and Asymmetric ARDL Estimations
Version: 1.1.0
Maintainer: Huseyin Karamelikli <hakperest@gmail.com>
Description: Implements estimation procedures for Autoregressive Distributed Lag (ARDL) and Nonlinear ARDL (NARDL) models, which allow researchers to investigate both short- and long-run relationships in time series data under mixed orders of integration. The package supports simultaneous modeling of symmetric and asymmetric regressors, flexible treatment of short-run and long-run asymmetries, and automated equation handling. It includes several cointegration testing approaches such as the Pesaran-Shin-Smith F and t bounds tests, and the restricted ECM test. Methodological foundations are provided in Pesaran, Shin, and Smith (2001) <doi:10.1016/S0304-4076(01)00049-5> and Shin, Yu, and Greenwood-Nimmo (2014, ISBN:9780123855079).
License: GPL-3
Encoding: UTF-8
LazyData: true
Imports: stats, msm, lmtest, nlWaldTest, car, ggplot2, utils
RoxygenNote: 7.3.3
Suggests: knitr, rmarkdown, officer, flextable, equatags, magrittr, rlang, tidyr, dplyr, testthat (>= 3.0.0)
NeedsCompilation: no
VignetteBuilder: knitr
Packaged: 2026-03-09 10:29:54 UTC; huseyin
Author: Huseyin Karamelikli [aut, cre], Huseyin Utku Demir [aut]
Depends: R (>= 3.5.0)
Config/testthat/edition: 3
Repository: CRAN
Date/Publication: 2026-03-09 11:00:02 UTC

Produce Bootstrap Confidence Intervals for Dynamic Multipliers

Description

This function computes bootstrap confidence intervals (CI) for dynamic multipliers of a specified variable in a model estimated using the kardl package. The bootstrap method generates resampled datasets to estimate the variability of the dynamic multipliers, providing upper and lower bounds for the confidence interval.

Usage

bootstrap(kmodel, horizon = 80, replications = 100, level = 95, minProb = 0)

Arguments

kmodel

The model produced by the kardl function. This is the model object from which the dynamic multipliers are calculated.

horizon

An integer specifying the horizon over which dynamic multipliers will be computed; the horizon defines the time frame of the analysis. Default is 80.

replications

An integer indicating the number of bootstrap replications to perform. Higher values increase accuracy but also computational time. Default is 100.

level

A numeric value specifying the confidence level for the intervals, expressed as a percentage (e.g., 95 for a 95% confidence interval). Default is 95.

minProb

A numeric value specifying the minimum p-value threshold for including coefficients in the bootstrap. Coefficients with p-values above this threshold will be set to zero in the bootstrap samples. Default is 0 (no threshold). This parameter allows users to control the inclusion of coefficients in the bootstrap process based on their statistical significance. Setting a threshold can help focus the analysis on more relevant variables, but it may also exclude potentially important effects if set too stringently.
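As a hypothetical illustration of the threshold (the 0.05 value here is an assumption, not a package default), the following call would set to zero, in each bootstrap sample, any coefficient whose p-value exceeds 5%:

  # kardl_model is assumed to be a model previously fitted with kardl()
  boot_sig <- bootstrap(kardl_model, replications = 100, minProb = 0.05)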

Details

The mpsi component of the output contains the dynamic multiplier estimates along with their upper and lower confidence intervals. These values are provided for each variable and at each time horizon.
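The bounds are obtained from the distribution of the multipliers across bootstrap replications. As an illustration, assuming a percentile-type construction (the package may apply a different variant), for level = 95 the bounds at horizon h would be the empirical 2.5% and 97.5% quantiles of the B replicated multipliers:

CI_h = \left[ q_{0.025}\left(\hat{m}_h^{(1)}, \ldots, \hat{m}_h^{(B)}\right),\; q_{0.975}\left(\hat{m}_h^{(1)}, \ldots, \hat{m}_h^{(B)}\right) \right]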

Value

A list containing the bootstrap results, including the mpsi component of dynamic multiplier estimates with their lower and upper confidence bounds (see Details).

See Also

mplier for calculating dynamic multipliers

Examples


library(dplyr)

# Example usage of the bootstrap function

# Fit a model using kardl
kardl_model <- kardl(imf_example_data,
                     CPI ~ ER + PPI + asym(ER) +
                       deterministic(covid) + trend,
                     mode = c(1, 2, 3, 0))

# Perform bootstrap with a small number of replications for speed
boot <- bootstrap(kardl_model, replications = 5)

# The boot object includes the plots for the bootstrapped variables;
# names() provides an overview of its components
names(boot)


 # Inspect the first few rows of the dynamic multiplier estimates
  head(boot$mpsi)


  summary(boot)

 # Retrieve plots generated during the bootstrap process
 # Accessing all plots
  plot(boot)

 # Accessing the plot for a specific variable by its name
  plot(boot, variable = "PPI")
 plot(boot, variable = "ER")



 # Using the pipe operator with dplyr
 library(dplyr)

 imf_example_data %>%
   kardl(CPI ~ PPI + asym(ER) + trend, maxlag = 2) %>%
   bootstrap(replications = 5) %>%
   plot(variable = "ER")



Perform the Error Correction Model (ECM) test to assess cointegration in the model

Description

Performs the Error Correction Model (ECM) test, which determines whether the variables in the model are cointegrated. Cointegration indicates a long-term equilibrium relationship between variables despite short-term deviations. The test examines the short-run dynamics while adjusting for deviations from equilibrium; if it confirms cointegration, the variables move together over time and maintain a stable long-run relationship. This is critical for ensuring that the model captures both short-term fluctuations and long-term equilibrium behavior.

Usage

ecm(
  data = NULL,
  formula = NULL,
  case = 3,
  maxlag = NULL,
  mode = NULL,
  criterion = NULL,
  differentAsymLag = NULL,
  batch = NULL,
  ...
)

Arguments

data

A data frame containing the variables used in the analysis.

formula

A formula specifying the long-run model equation. This formula defines the relationships between the dependent variable and explanatory variables, including options for deterministic terms, asymmetric variables, and a trend component.

Example formula: y ~ x + z + Asymmetric(z) + Lasymmetric(x2 + x3) + Sasymmetric(x3 + x4) + deterministic(dummy1 + dummy2) + trend

Details

The formula allows flexible specification of variables and their roles:

  • Deterministic variables: Deterministic regressors (e.g., dummy variables) can be included using deterministic(). Multiple deterministic variables may be supplied using +, for example deterministic(dummy1 + dummy2). These variables are treated as fixed components and are not associated with short-run or long-run dynamics.

  • Asymmetric variables: Asymmetric decompositions can be specified for short-run and/or long-run dynamics:

    • Sasymmetric: Specifies short-run asymmetric variables. For example, Sasymmetric(x1 + x2) applies short-run asymmetric decomposition to x1 and x2.

    • Lasymmetric: Specifies long-run asymmetric variables. For example, Lasymmetric(x1 + x3) applies long-run asymmetric decomposition to x1 and x3.

    • Asymmetric: Specifies variables that enter both short-run and long-run asymmetric decompositions. For example, Asymmetric(x1 + x3) applies asymmetric decomposition in both dynamics.
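The asymmetric decompositions follow the partial-sum construction of Shin, Yu, and Greenwood-Nimmo (2014), in which each asymmetric regressor x_i is split into the cumulative sums of its positive and negative changes:

x_{i,t}^{+} = \sum_{j=1}^{t} \max(\Delta x_{i,j}, 0), \qquad x_{i,t}^{-} = \sum_{j=1}^{t} \min(\Delta x_{i,j}, 0)

so that x_{i,t} = x_{i,0} + x_{i,t}^{+} + x_{i,t}^{-}; the decomposed series then enter the short-run and/or long-run dynamics according to the operator used.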

A trend term may be included to capture deterministic linear time trends by simply adding trend to the formula.

The formula also supports the use of . to represent all available regressors in the supplied data (excluding the dependent variable), following standard R formula conventions.

All of the operators deterministic(), Sasymmetric(), Lasymmetric(), and Asymmetric() follow the same usage rules:

  • They can be freely combined within a single formula, for example:

      y ~ . +
        Asymmetric(z) +
        Lasymmetric(x2 + x3) +
        Sasymmetric(x3 + x4) +
        deterministic(dummy1 + dummy2) +
        trend
      
  • They must not be nested within one another. Valid usage: y ~ x + deterministic(dummy) + Asymmetric(z). Invalid usage (to be avoided): y ~ x + deterministic(Asymmetric(z)) or y ~ x + Asymmetric(deterministic(dummy)).

  • Where applicable, arguments are validated internally using match.arg(). Consequently, abbreviated inputs are accepted provided they uniquely identify a valid option. For example, if "asymmetric" is an admissible value, specifying "a" is sufficient. For clarity and reproducibility, however, full argument names are recommended.

These components may therefore be combined flexibly to construct a specification tailored to the empirical analysis.

case

An integer specifying the case number for the restricted ECM test. The available cases are:

  • case 1: No constant, no trend.

  • case 2: Restricted constant, no trend.

  • case 3: Unrestricted constant, no trend.

  • case 4: Unrestricted constant, restricted trend.

  • case 5: Unrestricted constant, unrestricted trend.

The choice of case depends on the specific characteristics of the data and the model being tested. Each case corresponds to different assumptions about the presence of a constant term and a trend in the model, which can affect the interpretation of the test results and the conclusions about cointegration.

maxlag

An integer specifying the maximum number of lags to be considered for the model. The default value is 4. This parameter sets an upper limit on the lag length during the model estimation process.

details

The maxlag parameter is crucial for defining the maximum lag length that the model will evaluate when selecting the optimal lag structure based on the specified criterion. It controls the computational effort and helps prevent overfitting by restricting the search space for lag selection.

  • If the data has a short time horizon or is prone to overfitting, consider reducing maxlag.

  • If the data is expected to have long-term dependencies, increasing maxlag may be necessary to capture the relevant dynamics.

Setting an appropriate value for maxlag depends on the nature of your dataset and the context of the analysis:

  • For small datasets or quick tests, use smaller values (e.g., maxlag = 2).

  • For datasets with more observations or longer-term patterns, larger values (e.g., maxlag = 8) may be appropriate, though this increases computational time.

examples

Using the default maximum lag (4)

kardl(data, MyFormula, maxlag = 4)

Reducing the maximum lag to 2 for faster computation

kardl(data, MyFormula, maxlag = 2)

Increasing the maximum lag to 8 for datasets with longer dependencies

kardl(data, MyFormula, maxlag = 8)

mode

Specifies the mode of estimation and output control. This parameter determines how the function handles lag estimation and what kind of feedback or control is provided during the process. The available options are:

  • "quick" (default): Selects the lag structure using a fast search rather than evaluating the full grid of lag combinations, making it the quickest option and suitable for interactive use.

  • "grid": Evaluates the full grid of lag combinations, displaying progress and messages in the console. This mode is suitable for interactive use or for users who want to monitor the estimation process in real-time. It provides detailed feedback for debugging or observation but may use additional resources due to verbose output.

  • "grid_custom": Suppresses most or all console output, prioritizing faster execution and reduced resource usage on PCs or servers. This mode is recommended for high-performance scenarios, batch processing, or when the estimation process does not require user monitoring. Suitable for large-scale or repeated runs where output is unnecessary.

  • User-defined vector: A numeric vector of lag values specified by the user, allowing full customization of the lag structure used in model estimation. When a user-defined vector is provided (e.g., 'c(1, 2, 4, 5)'), the function skips the lag optimization process and directly uses the specified lags.

    - Lag values can be given directly as a numeric vector. For example, mode = c(1, 2, 4, 5) assigns lags of 1, 2, 4, and 5 to the variables in the order they appear in the formula.
    - Alternatively, lag values can be assigned to variables by name for clarity and control. For example, mode = c(CPI = 2, ER_POS = 3, ER_NEG = 1, PPI = 3) assigns lags to the named variables explicitly.
    - Verify that the lags were designated correctly by inspecting kardl_model$properLag after estimation.

    Attention!
    - A function-based (user-defined) criterion can be specified for model selection, but only for mode = "grid_custom" and mode = "quick"; mode = "grid" is restricted to predefined criteria (e.g., AIC or BIC). See the modelCriterion function documentation for available criteria.
    - When using a numeric vector, ensure the order of the lag values matches the order of the variables in your formula.
    - If using a named vector, double-check the variable names to avoid mismatches or unintended results.
    - This mode bypasses the automatic lag optimization and assumes the user-defined lags are correct.

The mode parameter provides flexibility for different use cases:

  • Use "grid" for debugging or interactive use where progress visibility is important.

  • Use "grid_custom" to minimize overhead in computationally intensive tasks.

  • Specify a user-defined vector to customize the lag structure based on prior knowledge or analysis.

Selecting the appropriate mode can improve the efficiency and usability of the function depending on the user's requirements and the computational environment.

criterion

A string specifying the information criterion to be used for selecting the optimal lag structure. The available options are:

  • "AIC": Akaike Information Criterion (default). This criterion balances model fit and complexity, favoring models that explain the data well with fewer parameters.

  • "BIC": Bayesian Information Criterion. This criterion imposes a stronger penalty for model complexity than AIC, making it more conservative in selecting models with fewer parameters.

  • "AICc": Corrected Akaike Information Criterion. This is an adjusted version of AIC that accounts for small sample sizes, making it more suitable when the number of observations is limited relative to the number of parameters.

  • "HQ": Hannan-Quinn Information Criterion. This criterion provides a compromise between AIC and BIC, favoring models that balance fit and complexity without being overly conservative.

The criterion can be specified as a string (e.g., "AIC") or as a user-defined function that takes a fitted model object. Please visit the modelCriterion function documentation for more details on using custom criteria.
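For instance, a custom criterion could be supplied as a function returning a single numeric score to be minimized (a sketch; myCriterion is a hypothetical name, and per the mode documentation function-based criteria are only supported with mode = "quick" or mode = "grid_custom"):

  # Hypothetical user-defined criterion wrapping stats::BIC
  myCriterion <- function(model) stats::BIC(model)
  ecm(criterion = myCriterion, mode = "grid_custom")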

differentAsymLag

A logical value indicating whether to allow different lag lengths for positive and negative decompositions.

batch

A string specifying the batch processing configuration in the format "current_batch/total_batches". If a user utilizes the grid or grid_custom mode and wants to split the lag search into multiple batches, this parameter defines the current batch and the total number of batches. For example, "2/5" indicates the second of five batches. The default value is "1/1", meaning the entire lag search is performed in a single batch.
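For instance, a lag search could be split across two runs (a sketch of the format described above; the two calls might be executed in separate sessions or on separate machines):

  # First half of the lag grid
  ecm(mode = "grid_custom", batch = "1/2")
  # Second half
  ecm(mode = "grid_custom", batch = "2/2")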

...

Additional arguments passed to the function.

Value

A list containing the results of the restricted ECM test.

Hypothesis testing

The restricted ECM test, also known as the PSS t Bound test, is a statistical test used to assess the presence of cointegration in a model. Cointegration refers to a long-term equilibrium relationship between two or more time series variables. The PSS t Bound test is based on the work of Pesaran, Shin, and Smith (2001) and is particularly useful for models with small sample sizes.

The null and alternative hypotheses for the restricted ECM test are as follows:

H_{0}: \theta = 0

H_{1}: \theta \neq 0

The null hypothesis (H_{0}) states that there is no cointegration in the model, meaning that the long-run relationship between the variables is not significant. The alternative hypothesis (H_{1}) suggests that there is cointegration, indicating a significant long-term relationship between the variables.

The test statistic is calculated as the t-statistic of the coefficient of the error correction term (\theta) in the ECM model. If the absolute value of the t-statistic exceeds the critical value from the PSS t Bound table, we reject the null hypothesis in favor of the alternative hypothesis, indicating that cointegration is present.

The cases for the restricted ECM bounds test are those described under the case argument above.

The Error Correction Model (ECM) is specified as follows:

\begin{aligned} \Delta y_t &= \phi + \varphi t + \sum_{j=1}^{p} \gamma_j \Delta y_{t-j} + \sum_{i=1}^{k} \sum_{j=0}^{q_i} \beta_{ij} \Delta x_{i,t-j} + \theta (y_{t-1} - \sum_{i=1}^{k} \alpha_i x_{i,t-1} ) + e_t \end{aligned}

See Also

kardl, pssf, psst, narayan

Examples


suppressPackageStartupMessages(library(dplyr))
suppressPackageStartupMessages(library(tidyr))
suppressPackageStartupMessages(library(ggplot2))


 # Sample article: THE DYNAMICS OF EXCHANGE RATE PASS-THROUGH TO DOMESTIC PRICES IN TURKEY
 kardl_set(formula=CPI~ER+PPI+asym(ER)+deterministic(covid)+trend ,
           data=imf_example_data ,
           maxlag=3)

 ecm_model_grid<- ecm(mode = "grid")
 ecm_model_grid

 # Checking the cointegration test results using pesaran t test
 psst(ecm_model_grid)
 # Getting the details of psst result
 summary(psst(ecm_model_grid))

 # using the grid_custom mode for faster execution without console output
 ecm_model<- imf_example_data %>% ecm(mode = "grid_custom")
 ecm_model

 # Estimating the model with user-defined lag values
 ecm_model2 <- ecm(mode = c(2, 1, 1, 3))
 # Getting the results
 ecm_model2
 # Getting the summary of the results
 summary(ecm_model2)
 # OR using pipe operator
 imf_example_data %>% ecm(CPI~PPI+asym(ER) +trend,case=4) %>% summary()

 # For increasing the performance of finding the most fitted lag vector
 ecm(mode = "grid_custom")
 # Setting max lag instead of default value [4]
 ecm(maxlag = 2, mode = "grid_custom")
 # Using another criterion for finding the best lag
 ecm(criterion = "HQ", mode = "grid_custom")



 # For using different lag values for negative and positive decompositions of non-linear variables

 # setting the same lags for positive and negative decompositions
 kardl_set(differentAsymLag = FALSE)

 sameAsymLags <- ecm(mode = "grid_custom")
 sameAsymLags$lagInfo$OptLag

 # setting different lags for positive and negative decompositions
 diffAsymLags <- ecm(differentAsymLag = TRUE, mode = "grid_custom")
 diffAsymLags$lagInfo$OptLag


 # Setting the prefixes and suffixes for non-linear variables
 kardl_reset()
 kardl_set(AsymPrefix = c("asyP_","asyN_"), AsymSuffix = c("_PP","_NN"))
 customizedNames<-ecm(imf_example_data, CPI~ER+PPI+asym(ER) )
 customizedNames

 # For having the lags plot

 #  ecm_model_grid[["LagCriteria"]] is a matrix, convert it to a data frame
 LagCriteria <- as.data.frame(ecm_model_grid$lagInfo$LagCriteria)
 # Rename columns for easier access and convert relevant columns to numeric
 colnames(LagCriteria) <- c("lag", "AIC", "BIC", "AICc", "HQ")
 LagCriteria <- LagCriteria %>%  mutate(across(c(AIC, BIC, HQ), as.numeric))

 # Pivot the data to a long format excluding AICc

 LagCriteria_long <- LagCriteria %>%  select(-AICc) %>%
   pivot_longer(cols = c(AIC, BIC, HQ), names_to = "Criteria", values_to = "Value")
 # Find the minimum value for each criterion
 min_values <- LagCriteria_long %>%  group_by(Criteria) %>%
   slice_min(order_by = Value) %>%  ungroup()

 # Create the ggplot with lines, highlight minimum values, and add labels
 ggplot(LagCriteria_long, aes(x = lag, y = Value, color = Criteria, group = Criteria)) +
   geom_line() +
   geom_point(data = min_values, aes(x = lag, y = Value), color = "red", size = 3, shape = 8) +
   geom_text(data = min_values, aes(x = lag, y = Value, label = lag),
             vjust = 1.5, color = "black", size = 3.5) +
   labs(title = "Lag Criteria Comparison", x = "Lag Configuration",  y = "Criteria Value") +
   theme_minimal() +
   theme(axis.text.x = element_text(angle = 45, hjust = 1))



IMF Example Data

Description

This is an example data set used for testing purposes. It contains monthly data on exchange rates, consumer price index, producer price index, and a dummy variable for the COVID-19 pandemic for Turkey from January 1985 to February 2024.

Usage

imf_example_data

Format

A data frame with 470 rows and 4 variables:

ER

Numeric. Exchange rate of Turkey.

CPI

Numeric. CPI of Turkey.

PPI

Numeric. PPI of Turkey.

covid

Integer. COVID-19 pandemic dummy variable.

Details

These data were obtained via the imf.data package. The sample is a static snapshot (not updated) produced by the following code:

install.packages("imf.data")
library("imf.data")
IFS <- load_datasets("IFS")

trdata <- IFS$get_series(freq = "M", ref_area = "TR",
                         indicator = c("PCPI_IX", "AIP_IX", "ENDE_XDC_USD_RATE"),
                         start_period = "1985-01", end_period = "2024-02")
PeriodRow<-trdata[,1]
trdata[,1]<-NULL
colnames(trdata)<-c("ER","CPI","PPI")
trdata<-log(as.data.frame(lapply(trdata, function(x) as.numeric(x))))
rownames(trdata)<-PeriodRow

# Inserting the covid dummy variable

trdata<-cbind(trdata,covid=0)
trdata[420:470,4]<-1

See Also

load_datasets

Examples

data(imf_example_data)
head(imf_example_data)

Estimate an ARDL or NARDL model with automatic lag selection

Description

This function estimates an Autoregressive Distributed Lag (ARDL) or Nonlinear ARDL (NARDL) model based on the provided data and model formula. It allows for flexible specification of variables, including deterministic terms, asymmetric variables, and trend components. The function also supports automatic lag selection using various information criteria.

Usage

kardl(
  data = NULL,
  formula = NULL,
  maxlag = NULL,
  mode = NULL,
  criterion = NULL,
  differentAsymLag = NULL,
  batch = NULL,
  ...
)

Arguments

data

A data frame containing the variables used in the analysis.

formula

A formula specifying the long-run model equation. This formula defines the relationships between the dependent variable and explanatory variables, including options for deterministic terms, asymmetric variables, and a trend component.

Example formula: y ~ x + z + Asymmetric(z) + Lasymmetric(x2 + x3) + Sasymmetric(x3 + x4) + deterministic(dummy1 + dummy2) + trend

Details

The formula allows flexible specification of variables and their roles:

  • Deterministic variables: Deterministic regressors (e.g., dummy variables) can be included using deterministic(). Multiple deterministic variables may be supplied using +, for example deterministic(dummy1 + dummy2). These variables are treated as fixed components and are not associated with short-run or long-run dynamics.

  • Asymmetric variables: Asymmetric decompositions can be specified for short-run and/or long-run dynamics:

    • Sasymmetric: Specifies short-run asymmetric variables. For example, Sasymmetric(x1 + x2) applies short-run asymmetric decomposition to x1 and x2.

    • Lasymmetric: Specifies long-run asymmetric variables. For example, Lasymmetric(x1 + x3) applies long-run asymmetric decomposition to x1 and x3.

    • Asymmetric: Specifies variables that enter both short-run and long-run asymmetric decompositions. For example, Asymmetric(x1 + x3) applies asymmetric decomposition in both dynamics.

A trend term may be included to capture deterministic linear time trends by simply adding trend to the formula.

The formula also supports the use of . to represent all available regressors in the supplied data (excluding the dependent variable), following standard R formula conventions.

All of the operators deterministic(), Sasymmetric(), Lasymmetric(), and Asymmetric() follow the same usage rules:

  • They can be freely combined within a single formula, for example:

      y ~ . +
        Asymmetric(z) +
        Lasymmetric(x2 + x3) +
        Sasymmetric(x3 + x4) +
        deterministic(dummy1 + dummy2) +
        trend
      
  • They must not be nested within one another. Valid usage: y ~ x + deterministic(dummy) + Asymmetric(z). Invalid usage (to be avoided): y ~ x + deterministic(Asymmetric(z)) or y ~ x + Asymmetric(deterministic(dummy)).

  • Where applicable, arguments are validated internally using match.arg(). Consequently, abbreviated inputs are accepted provided they uniquely identify a valid option. For example, if "asymmetric" is an admissible value, specifying "a" is sufficient. For clarity and reproducibility, however, full argument names are recommended.

These components may therefore be combined flexibly to construct a specification tailored to the empirical analysis.

maxlag

An integer specifying the maximum number of lags to be considered for the model. The default value is 4. This parameter sets an upper limit on the lag length during the model estimation process.

details

The maxlag parameter is crucial for defining the maximum lag length that the model will evaluate when selecting the optimal lag structure based on the specified criterion. It controls the computational effort and helps prevent overfitting by restricting the search space for lag selection.

  • If the data has a short time horizon or is prone to overfitting, consider reducing maxlag.

  • If the data is expected to have long-term dependencies, increasing maxlag may be necessary to capture the relevant dynamics.

Setting an appropriate value for maxlag depends on the nature of your dataset and the context of the analysis:

  • For small datasets or quick tests, use smaller values (e.g., maxlag = 2).

  • For datasets with more observations or longer-term patterns, larger values (e.g., maxlag = 8) may be appropriate, though this increases computational time.

examples

Using the default maximum lag (4)

kardl(data, MyFormula, maxlag = 4)

Reducing the maximum lag to 2 for faster computation

kardl(data, MyFormula, maxlag = 2)

Increasing the maximum lag to 8 for datasets with longer dependencies

kardl(data, MyFormula, maxlag = 8)

mode

Specifies the mode of estimation and output control. This parameter determines how the function handles lag estimation and what kind of feedback or control is provided during the process. The available options are:

  • "quick" (default): Selects the lag structure using a fast search rather than evaluating the full grid of lag combinations, making it the quickest option and suitable for interactive use.

  • "grid": Evaluates the full grid of lag combinations, displaying progress and messages in the console. This mode is suitable for interactive use or for users who want to monitor the estimation process in real-time. It provides detailed feedback for debugging or observation but may use additional resources due to verbose output.

  • "grid_custom": Suppresses most or all console output, prioritizing faster execution and reduced resource usage on PCs or servers. This mode is recommended for high-performance scenarios, batch processing, or when the estimation process does not require user monitoring. Suitable for large-scale or repeated runs where output is unnecessary.

  • User-defined vector: A numeric vector of lag values specified by the user, allowing full customization of the lag structure used in model estimation. When a user-defined vector is provided (e.g., 'c(1, 2, 4, 5)'), the function skips the lag optimization process and directly uses the specified lags.

    - Lag values can be given directly as a numeric vector. For example, mode = c(1, 2, 4, 5) assigns lags of 1, 2, 4, and 5 to the variables in the order they appear in the formula.
    - Alternatively, lag values can be assigned to variables by name for clarity and control. For example, mode = c(CPI = 2, ER_POS = 3, ER_NEG = 1, PPI = 3) assigns lags to the named variables explicitly.
    - Verify that the lags were designated correctly by inspecting kardl_model$properLag after estimation.

    Attention!
    - A function-based (user-defined) criterion can be specified for model selection, but only for mode = "grid_custom" and mode = "quick"; mode = "grid" is restricted to predefined criteria (e.g., AIC or BIC). See the modelCriterion function documentation for available criteria.
    - When using a numeric vector, ensure the order of the lag values matches the order of the variables in your formula.
    - If using a named vector, double-check the variable names to avoid mismatches or unintended results.
    - This mode bypasses the automatic lag optimization and assumes the user-defined lags are correct.

The mode parameter provides flexibility for different use cases:

  • Use "grid" for debugging or interactive use where progress visibility is important.

  • Use "grid_custom" to minimize overhead in computationally intensive tasks.

  • Specify a user-defined vector to customize the lag structure based on prior knowledge or analysis.

Selecting the appropriate mode can improve the efficiency and usability of the function depending on the user's requirements and the computational environment.

criterion

A string specifying the information criterion to be used for selecting the optimal lag structure. The available options are:

  • "AIC": Akaike Information Criterion (default). This criterion balances model fit and complexity, favoring models that explain the data well with fewer parameters.

  • "BIC": Bayesian Information Criterion. This criterion imposes a stronger penalty for model complexity than AIC, making it more conservative in selecting models with fewer parameters.

  • "AICc": Corrected Akaike Information Criterion. This is an adjusted version of AIC that accounts for small sample sizes, making it more suitable when the number of observations is limited relative to the number of parameters.

  • "HQ": Hannan-Quinn Information Criterion. This criterion provides a compromise between AIC and BIC, favoring models that balance fit and complexity without being overly conservative.

The criterion can be specified as a string (e.g., "AIC") or as a user-defined function that takes a fitted model object. Please visit the modelCriterion function documentation for more details on using custom criteria.

differentAsymLag

A logical value indicating whether to allow different lag lengths for positive and negative decompositions.

batch

A string specifying the batch processing configuration in the format "current_batch/total_batches". If a user utilizes the grid or grid_custom mode and wants to split the lag search into multiple batches, this parameter defines the current batch and the total number of batches. For example, "2/5" indicates the second of five batches. The default value is "1/1", meaning the entire lag search is performed in a single batch.

...

Additional arguments passed to the function.

Details

Note: All arguments of this function can be set using kardl_set function.

Value

An object of class kardl_lm containing the estimated ARDL or NARDL model.

See Also

ecm, kardl_set, kardl_get, kardl_reset, modelCriterion

Examples


suppressPackageStartupMessages(library(dplyr))
suppressPackageStartupMessages(library(tidyr))
suppressPackageStartupMessages(library(ggplot2))

# Sample article: THE DYNAMICS OF EXCHANGE RATE PASS-THROUGH TO DOMESTIC PRICES IN TURKEY

kardl_set(formula = CPI ~ ER + PPI + Asymmetric(ER) + deterministic(covid) + trend,
          data=imf_example_data,
          maxlag=2
) # setting the default values of the kardl function



kardl_model_grid<-kardl( mode = "grid")
kardl_model_grid

kardl_model<- imf_example_data %>% kardl(mode = "grid_custom")
kardl_model
kardl_model2 <- kardl(mode = c(2, 1, 1, 3))

# Getting the results
kardl_model2

# Getting the summary of the results
summary(kardl_model)

  # OR
  imf_example_data %>% kardl(formula=CPI~PPI+Asymmetric(ER)) %>% summary()

# using . in the formula means that all variables in the data will be used

kardl(formula=CPI~.+deterministic(covid),mode = "grid")

# Setting max lag instead of default value [4]
kardl(imf_example_data,
      CPI~ER+PPI+Lasymmetric(ER),
      maxlag = 3, mode = "grid_custom")

# Using another criterion for finding the best lag
kardl_set(criterion = "HQ") # setting the criterion to HQ
kardl( mode = "grid_custom")

# using fixed, user-specified lag values
kardl( mode=c(1,2,3,0))

# Comparing identical vs. different lag lengths for the positive and negative
# decompositions of non-linear variables:

same<-kardl(formula=CPI~Asymmetric(ER),maxlag=2, mode = "grid_custom",differentAsymLag = FALSE)
dif<-kardl(formula=CPI~Sasymmetric(ER),maxlag=2, mode = "grid_custom",differentAsymLag = TRUE)

same$lagInfo$OptLag
dif$lagInfo$OptLag

# Setting the prefixes and suffixes for non-linear variables
kardl_set(AsymPrefix = c("asyP_","asyN_"), AsymSuffix = c("_PP","_NN"))
kardl()

# For having the lags plot

#  kardl_model_grid$lagInfo$LagCriteria is a matrix, convert it to a data frame
LagCriteria <- as.data.frame(kardl_model_grid$lagInfo$LagCriteria)
# Rename columns for easier access and convert relevant columns to numeric
colnames(LagCriteria) <- c("lag", "AIC", "BIC", "AICc", "HQ")

LagCriteria <- LagCriteria %>%  mutate(across(c(AIC, BIC, HQ), as.numeric))

# Pivot the data to a long format excluding AICc

 LagCriteria_long <- LagCriteria %>%
  select(-AICc) %>%
  pivot_longer(cols = c(AIC, BIC, HQ), names_to = "Criteria", values_to = "Value")

 # Find the minimum value for each criterion
 min_values <- LagCriteria_long %>%  group_by(Criteria) %>%
  slice_min(order_by = Value) %>%  ungroup()

 # Create the ggplot with lines, highlight minimum values, and add labels
 ggplot2::ggplot(LagCriteria_long, aes(x = lag, y = Value, color = Criteria, group = Criteria)) +
  geom_line() +
  geom_point(data = min_values, aes(x = lag, y = Value), color = "red", size = 3, shape = 8) +
  geom_text(data = min_values, aes(x = lag, y = Value, label = lag),
            vjust = 1.5, color = "black", size = 3.5) +
  labs(title = "Lag Criteria Comparison ", x = "Lag Configuration",  y = "Criteria Value") +
  theme_minimal() +
  theme(axis.text.x = element_text(angle = 45, hjust = 1))


Function to Get KARDL Package Options

Description

This function retrieves the current settings of the kardl package. Users can specify option names to get their values or call the function without arguments to retrieve all current settings.

Usage

kardl_get(...)

Arguments

...

Option names to retrieve. If no arguments are provided, all options will be returned.

Value

If no arguments are provided, returns all options as a list. If specific option names are provided, returns their values.

See Also

kardl_set, kardl_reset

Examples


# Get all options
kardl_get()
# Get specific options
kardl_get("maxlag", "mode")

# Note: In interactive use, avoid calling kardl_get() directly to prevent cluttering the console.

a<-kardl_get()
a$AsymSuffix


Calculate long-run multipliers from a KARDL model

Description

This function calculates the long-run parameters of a KARDL model estimated using the kardl function. The long-run parameters are obtained by dividing the negative of the coefficients of the independent variables by the coefficient of the lagged dependent variable. If an intercept is included in the model, it is likewise standardized by dividing it by the negative of the coefficient of the lagged dependent variable.

Usage

kardl_longrun(model)

Arguments

model

An object of class kardl estimated using the kardl function.

Details

The function also calculates the standard errors of the long-run multipliers using the delta method, which accounts for the covariance between the coefficients. The fitted values and residuals of the long-run model are calculated based on the original data and the long-run multipliers.

The function returns an object of class kardl_long_run, which contains the long-run multipliers, their standard errors, t-statistics, p-values, fitted values, residuals, and other relevant information for further analysis and diagnostics.

Note that the fitted values and residuals from the long-run model are not centered (i.e., they do not have a mean of zero) by design, which means that diagnostic plots and residual-based tests may not be valid for this model. The primary focus of this function is on the estimation of the long-run multipliers and their associated statistics.

LongRunMultiplier_i = -\frac{\beta_i}{\beta_{dep}}

t-values and p-values are calculated using the standard errors obtained from the delta method, which accounts for the covariance between the coefficients.
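The delta-method calculation described above can be sketched with the msm package (which kardl imports). The toy regression and coefficient positions below are illustrative assumptions, not kardl internals:

```r
library(msm)

# Toy ECM-style regression: coefficient 2 plays the role of the lagged
# dependent variable, coefficient 3 a lagged regressor (hypothetical setup).
set.seed(1)
y <- cumsum(rnorm(100)); x <- cumsum(rnorm(100))
fit <- lm(diff(y) ~ head(y, -1) + head(x, -1))

b <- coef(fit); V <- vcov(fit)
longrun <- -b[3] / b[2]  # long-run multiplier: -beta_x / beta_dep
# Delta-method standard error; x2, x3 refer to the 2nd and 3rd coefficients
se <- msm::deltamethod(~ -x3 / x2, mean = b, cov = V)
c(multiplier = unname(longrun), std.error = se)
```

The t-statistic then follows as the ratio of the multiplier to this standard error.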

Value

An object of class kardl_long_run, which is a list containing:

See Also

kardl, pssf, psst

Examples

kardl_model<-kardl(imf_example_data,
                   CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                   mode=c(1,2,3,0))
long<-kardl_longrun(kardl_model)

# Calculate the long-run multipliers
long
# Details of the long-run multipliers
summary(long)


# Using magrittr
library(magrittr)

imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% kardl_longrun() %>% summary()


Function to Reset KARDL Package Options to Default Values

Description

This function resets all options in the kardl package to their default values.

Usage

kardl_reset(. = FALSE)

Arguments

.

If provided and not 'FALSE', the function will return this value after resetting the settings. If not provided or set to 'FALSE', it will return the current settings.

Value

If resetting options, returns the provided value (if any) or invisibly returns the current settings as a list.

See Also

kardl_set, kardl_get

Examples

# Set some options
kardl_set(criterion = "BIC", differentAsymLag = TRUE)

# Reset to default options
kardl_get("criterion")  # Check current settings
kardl_reset()
kardl_get("criterion")  # Check settings after reset

library(magrittr)
 MyFormula<-CPI~ER+PPI+asym(ER)+deterministic(covid)+trend
imf_example_data %>%
  kardl_set(LongCoef= "K1{lag}w1{varName}",differentAsymLag= FALSE ) %>%  kardl(MyFormula ) %>%
    kardl_reset()
kardl_get()

imf_example_data %>%
  kardl_reset() %>%
    kardl_set(LongCoef= "K2{lag}w2{varName}",differentAsymLag=FALSE ) %>%  kardl(MyFormula)

kardl_get(c("LongCoef","differentAsymLag","ShortCoef","batch"))


Function to Set KARDL Package Options

Description

This function allows users to set options for the kardl package. Users can specify named arguments to set options or call the function without arguments to retrieve all current settings.

Usage

kardl_set(. = FALSE, ...)

Arguments

.

If provided and not 'FALSE', the function will return this value after setting the options. If not provided or set to 'FALSE', it will return the current settings.

...

Named arguments corresponding to the options to be set. Valid option names include those defined in the kardl package settings.

Value

If no arguments are provided, returns all options as a list. If named arguments are provided, sets those options and returns the updated list.

See Also

kardl_get, kardl_reset

Examples

# Set options
kardl_set(maxlag = 5, mode = "grid")
# Get all options
kardl_get()
# Get specific options
kardl_get("maxlag", "mode")

# Note: In interactive use, avoid calling kardl_get() directly to prevent cluttering the console.


kardl_get()


# Example with magrittr pipe
library(magrittr)
# Set custom coefficient naming conventions

MyFormula<-CPI~ER+PPI+asym(ER)+deterministic(covid)+trend
kardl_set(ShortCoef = "L___{lag}.d.{varName}", formula = MyFormula, data = imf_example_data)
imf_example_data %>%   kardl(MyFormula)

kardl_reset()
kardl_get()

imf_example_data %>%  kardl_set(LongCoef= "LK{lag}_{varName}",ShortCoef = "D{lag}.d.{varName}") %>%
kardl(MyFormula)
kardl_get(c("LongCoef","ShortCoef"))


Merge two lists, giving precedence to the first list for overlapping names

Description

The first list takes precedence: when both lists contain items with the same name, the value from the first (left-hand) list is used in the merged result.

Usage

lmerge(first, second, ...)

Arguments

first

The first list

second

The second list

...

Additional lists to merge

Value

A merged list with unique names, prioritizing values from the first list in case of name conflicts.

See Also

append

Examples


a<-list("a"="first a","b"="second a","c"=list("w"=12,"k"=c(1,3,6)))
b<-list("a"="first b","b"="second b","d"=14,"e"=45)
theResult<- lmerge(a,b)
unlist(theResult)

# for right merge
lmerge(b,a)

# Unlisted return
theResult<- lmerge(a,b,c("v1"=11,22,3,"v5"=5))
theResult

m2<-list("m1"="kk2","m1.2.3"=list("m1.1.1"=333,"m.1.4"=918,"m.1.5"=982,"m.1.6"=981,"m.1.7"=928))
m3<-list("m1"="kk23","m2.3"=2233,"m1.2.4"=list("m1.1.1"=333444,"m.1.5"=982,"m.1.6"=91,"m.1.7"=928))
a<-c(32,34,542,"k"=35)
b<-c(65,"k"=34)

h1<-lmerge(a, m2)
unlist( h1)
h2<-lmerge(a,b,m2,m3,list("m1.1"=4))
unlist(h2)
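If lmerge behaves as described (left-list precedence for overlapping names), a minimal base-R equivalent for fully named lists is the following sketch; left_merge is a hypothetical helper, not part of the package:

```r
# Keep everything from `first`, then append only those elements of
# `second` whose names do not already occur in `first`.
left_merge <- function(first, second) {
  c(first, second[setdiff(names(second), names(first))])
}

a <- list(a = "first a", b = "second a")
b <- list(a = "first b", d = 14)
unlist(left_merge(a, b))
```

Unlike lmerge, this sketch does not handle unnamed elements or more than two lists.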

Model Selection Criterion

Description

Computes a model selection criterion (AIC, BIC, AICc, or HQ) or applies a user-defined function to evaluate a statistical model.

Usage

modelCriterion(estModel, cr, ...)

Arguments

estModel

An object containing the fitted model. The object should include at least:

  • estModel$model – the actual fitted model object (e.g., from lm, glm).

  • estModel$k – the number of estimated parameters.

  • estModel$n – the sample size.

cr

A character string specifying the criterion to compute. Options are "AIC", "BIC", "AICc", and "HQ". Alternatively, a user-defined function can be provided.

...

Additional arguments passed to the user-defined criterion function if cr is a function.

Details

This function returns model selection criteria used to compare the quality of different models. All criteria are defined such that lower values indicate better models (i.e., the goal is minimization).

If you wish to compare models using a maximization approach (e.g., log-likelihood), you can multiply the result by -1.

Note: The predefined string options (e.g., "AIC") do not return the same values as the built-in R functions AIC() or BIC(). In particular, the values returned by this function are divided by the sample size n (i.e., normalized AIC/BIC), which makes them more comparable across datasets of different sizes.

The function returns:

where:

If cr is a function, it is called with the fitted model and any additional arguments passed through ....

Value

A numeric value representing the selected criterion, normalized by the sample size if one of the predefined options is used.

See Also

kardl

Examples


# Example usage of modelCriterion function with a linear model
mylm<- lm(mpg ~ wt + hp, data = mtcars)
modelCriterion(mylm, AIC )
modelCriterion(mylm, "BIC" )
mm<-AIC(mylm)
 class(mm) == class(modelCriterion(mylm, "AIC"))

 # Example usage of modelCriterion function with a kardl model
 kardl_model <- kardl(imf_example_data,
                      CPI ~ ER + PPI + asym(ER) + deterministic(covid) + trend,
                      mode = c(1, 2, 3, 0))
 modelCriterion(kardl_model, "AIC")
 modelCriterion(kardl_model, AIC)
 AIC(kardl_model)
 modelCriterion(kardl_model, "BIC")

 # Using a custom criterion function
 my_cr_fun <- function(mod, ...) { AIC(mod) / length(mod$model[[1]]) }
 modelCriterion(kardl_model, my_cr_fun)


Calculating multipliers of estimated model

Description

By running m <- mplier(kardl_model), the object m$mpsi contains the final multiplier values, m$omega holds the omega values, and m$lambda holds the lambda values of the model.

Usage

mplier(kmodel, horizon = 80, minProb = 0)

Arguments

kmodel

The model produced by the kardl function, from which the dynamic multipliers will be calculated. The function expects an object of class kardl_lm, the standard output class for linear models estimated using the kardl package; if the input object does not inherit from this class, the function throws an error. The coefficients, lag structure, and variable names are extracted from this object to compute the multipliers.

horizon

The horizon over which multipliers will be computed. This parameter defines the time frame for the analysis, allowing users to specify how many periods into the future they want to calculate the multipliers for. For example, setting horizon = 40 will compute the multipliers for 40 periods ahead. The function uses this horizon to determine the number of rows in the output matrix of multipliers and to structure the calculations accordingly.

minProb

The minimum p-value threshold for including coefficients in the multipliers calculation. Coefficients with p-values above this threshold will be set to zero in the multipliers calculation. This parameter allows users to control the inclusion of coefficients based on their statistical significance. Setting a threshold can help focus the analysis on more relevant variables, but it may also exclude potentially important effects if set too stringently. The default value is 0, which means that all coefficients will be included regardless of their p-values.

Details

The mplier function computes dynamic multipliers from the coefficients and lag structure of a model estimated with the kardl package. It extracts the coefficients, lag structure, and variable names from the model, calculates the short-run coefficients, lambda values, and omega values, and returns a matrix of dynamic multipliers (mpsi) for further analysis or visualization. The dynamic multipliers show how changes in the independent variables affect the dependent variable over time. A minimum p-value threshold can be set to restrict the calculation to statistically significant coefficients. The function works only with models estimated using the kardl package and verifies the input class before proceeding. The resulting multipliers can be used for forecasting, policy analysis, and studying dynamic relationships in time series data.

\psi_{h}^{+} = \sum_{i = 0}^{h}\frac{\partial y_{t + i}}{\partial x_{t}^{+}} \quad ; \quad \psi_{h}^{-} = \sum_{i = 0}^{h}\frac{\partial y_{t + i}}{\partial x_{t}^{-}}

The above equation gives the cumulative dynamic multipliers of positive and negative changes in an independent variable (x) on the dependent variable (y) over a specified horizon (h): the sums of the partial derivatives of the dependent variable with respect to the positive and negative changes in the independent variable across the time horizon. These multipliers show how increases and decreases in the independent variable affect the dependent variable over time.
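For intuition, the recursion behind these cumulative multipliers can be worked out by hand for a simple ARDL(1,1) model y_t = phi*y_{t-1} + theta0*x_t + theta1*x_{t-1}. This is a generic sketch of the idea, not kardl's internal algorithm:

```r
# Impulse responses of y to a unit change in x in an ARDL(1,1):
# psi_0 = theta0, psi_1 = phi*psi_0 + theta1, psi_i = phi*psi_{i-1} for i > 1
phi <- 0.6; theta0 <- 0.3; theta1 <- 0.2
h <- 40
psi <- numeric(h + 1)
psi[1] <- theta0
for (i in 2:(h + 1)) psi[i] <- phi * psi[i - 1] + if (i == 2) theta1 else 0
cumulative <- cumsum(psi)      # dynamic multipliers over the horizon
tail(cumulative, 1)            # converges to the long-run multiplier
(theta0 + theta1) / (1 - phi)  # long-run multiplier for comparison
```

With h = 40 the cumulative multiplier is already indistinguishable from the long-run value (theta0 + theta1)/(1 - phi).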

Value

A list containing the following elements:

See Also

bootstrap

Examples


# This example demonstrates how to use the mplier function to calculate dynamic multipliers
# from a model estimated using the kardl package. The example includes fitting a model with
# the kardl function, calculating the multipliers, and visualizing the results using both
# base R plotting and ggplot2.

 kardl_model<-kardl(imf_example_data, CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
 mode=c(1,2,3,0))
 m<-mplier(kardl_model,40)
 head(m$mpsi)
 head(m$omega)
 head(m$lambda)

 # Displaying the summary of the multipliers object
 summary(m)

 # Visualize the dynamic multipliers

plot(m)

 # For plotting specific variables, you can specify them in the plot function. For example,
 # to plot the multipliers for the variable "ER":

 plot(m, variable = "ER")



Narayan Test

Description

This function performs the Narayan test, which is designed to assess cointegration using critical values specifically tailored for small sample sizes. Unlike traditional cointegration tests that may rely on asymptotic distributions, the Narayan test adjusts for the limitations of small samples, providing more accurate results in such contexts. This makes the test particularly useful for studies with fewer observations, as it accounts for sample size constraints when determining the presence of a long-term equilibrium relationship between variables.

Usage

narayan(kmodel, case = 3, signif_level = "auto")

Arguments

kmodel

The kardl object.

case

Numeric or character. Specifies the case of the test to be used in the function. Acceptable values are 1, 2, 3, 4, 5, and "auto". If "auto" is chosen, the function determines the case automatically based on the model's characteristics. Invalid values will result in an error.

  • 1: No intercept and no trend. This case is not supported by the Narayan test.

  • 2: Restricted intercept and no trend.

  • 3: Unrestricted intercept and no trend.

  • 4: Unrestricted intercept and restricted trend.

  • 5: Unrestricted intercept and unrestricted trend.

signif_level

Character or numeric. Specifies the significance level to be used in the function.

Acceptable values are "auto", "0.10", "0.1", "0.05", "0.025", and "0.01". If a numeric value is provided, it will be converted to a character string. If "auto" is chosen, the function determines the significance level automatically. Invalid values will result in an error.

Value

A list with class "htest" containing the following components:

Hypothesis testing

The null hypothesis (H0) of the F Bound test is that there is no cointegration among the variables in the model. In other words, it tests whether the long-term relationship between the variables is statistically significant. If the calculated F-statistic exceeds the upper critical value, we reject the null hypothesis and conclude that there is cointegration. Conversely, if the F-statistic falls below the lower critical value, we fail to reject the null hypothesis, indicating no evidence of cointegration. If the F-statistic lies between the two critical values, the result is inconclusive.

\Delta {y}_t = \psi + \varphi t + \eta _0 {y}_{t-1} + \sum_{i=1}^{k} { \eta _i {x}_{i,t-1} } + \sum_{j=1}^{p} { \gamma_{j} \Delta {y}_{t-j} }+ \sum_{i=1}^{k} {\sum_{j=0}^{q_i} { \beta_{ij} \Delta {x}_{i,t-j} } }+ e_t

Cases 1, 3, 5:

\mathbf{H_{0}:} \eta_0 = \eta_1 = \dots = \eta_k = 0

\mathbf{H_{1}:} \eta_{0} \neq \eta_{1} \neq \dots \neq \eta_{k} \neq 0

Case 2:

\mathbf{H_{0}:} \eta_0 = \eta_1 = \dots = \eta_k = \psi = 0

\mathbf{H_{1}:} \eta_{0} \neq \eta_{1} \neq \dots \neq \eta_{k} \neq \psi \neq 0

Case 4:

\mathbf{H_{0}:} \eta_0 = \eta_1 = \dots = \eta_k = \varphi = 0

\mathbf{H_{1}:} \eta_{0} \neq \eta_{1} \neq \dots \neq \eta_{k} \neq \varphi \neq 0

References

Narayan, P. K. (2005). The saving and investment nexus for China: evidence from cointegration tests. Applied economics, 37(17), 1979-1990.

See Also

pssf psst ecm

Examples

kardl_model<-kardl(imf_example_data,
                   CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                   mode=c(1,2,3,0))
my_test<-narayan(kardl_model)
# Getting the results of the test.
my_test
# Getting details of the test.
my_summary<-summary(my_test)
my_summary

# Getting the critical values of the test.
my_summary$crit_vals




# Using magrittr :

library(magrittr)
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% narayan()

# Getting details of the test results using magrittr:
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% narayan() %>% summary()



Parse a formula to detect specific variable patterns

Description

The parse_formula_vars() function analyzes a given formula to identify and extract variables that match specified patterns. It is particularly useful for isolating variables enclosed within certain functions or constructs in the formula, such as asym(), det(), or any user-defined patterns.

Usage

parse_formula_vars(formula)

Arguments

formula

The initial formula for the model, typically specified using R's formula syntax (e.g., y ~ x + f(x1 + x2)).

Value

A list containing:

See Also

formula and gregexpr

Examples


# Parse formulas containing various collection types like ()
formula_ <- y ~ x +det(s -gg- d) + asymS(d2 -rr+ s)-mm(y1+y2+y3)+asym(k1+k2+k3)+trend-huseyin
# Extract variables
parse_formula_vars(formula_)



Pesaran et al. (2001) Bounds F-Test for KARDL Models

Description

This function performs the Pesaran, Shin, and Smith (PSS) F Bound test to assess the presence of a long-term relationship (cointegration) between variables in the context of an autoregressive distributed lag (ARDL) model. The PSS F Bound test examines the joint significance of lagged levels of the variables in the model. It provides critical values for both the upper and lower bounds, which help determine whether the variables are cointegrated. If the calculated F-statistic falls outside these bounds, it indicates the existence of a long-term equilibrium relationship. This test is particularly useful when the underlying data includes a mix of stationary and non-stationary variables.

Usage

pssf(kmodel, case = 3, signif_level = "auto")

Arguments

kmodel

A fitted KARDL model object of class 'kardl_lm' created using the kardl function.

case

Numeric or character. Specifies the case of the test to be used in the function. Acceptable values are 1, 2, 3, 4, 5, and "auto". If "auto" is chosen, the function determines the case automatically based on the model's characteristics. Invalid values will result in an error.

  • 1: No intercept and no trend

  • 2: Restricted intercept and no trend

  • 3: Unrestricted intercept and no trend

  • 4: Unrestricted intercept and restricted trend

  • 5: Unrestricted intercept and unrestricted trend

signif_level

Character or numeric. Specifies the significance level to be used in the function. Acceptable values are "auto", "0.10", "0.1", "0.05", "0.025", and "0.01". If a numeric value is provided, it will be converted to a character string. If "auto" is chosen, the function determines the significance level automatically. Invalid values will result in an error.

Value

A list with class "htest" containing the following components:

Hypothesis testing

The null hypothesis (H0) of the F Bound test is that there is no cointegration among the variables in the model. In other words, it tests whether the long-term relationship between the variables is statistically significant. If the calculated F-statistic exceeds the upper critical value, we reject the null hypothesis and conclude that there is cointegration. Conversely, if the F-statistic falls below the lower critical value, we fail to reject the null hypothesis, indicating no evidence of cointegration. If the F-statistic lies between the two critical values, the result is inconclusive.

\Delta {y}_t = \psi + \varphi t + \eta _0 {y}_{t-1} + \sum_{i=1}^{k} { \eta _i {x}_{i,t-1} } + \sum_{j=1}^{p} { \gamma_{j} \Delta {y}_{t-j} }+ \sum_{i=1}^{k} {\sum_{j=0}^{q_i} { \beta_{ij} \Delta {x}_{i,t-j} } }+ e_t

Cases 1, 3, 5:

\mathbf{H_{0}:} \eta_0 = \eta_1 = \dots = \eta_k = 0

\mathbf{H_{1}:} \eta_{0} \neq \eta_{1} \neq \dots \neq \eta_{k} \neq 0

Case 2:

\mathbf{H_{0}:} \eta_0 = \eta_1 = \dots = \eta_k = \psi = 0

\mathbf{H_{1}:} \eta_{0} \neq \eta_{1} \neq \dots \neq \eta_{k} \neq \psi \neq 0

Case 4:

\mathbf{H_{0}:} \eta_0 = \eta_1 = \dots = \eta_k = \varphi = 0

\mathbf{H_{1}:} \eta_{0} \neq \eta_{1} \neq \dots \neq \eta_{k} \neq \varphi \neq 0

References

Pesaran, M. H., Shin, Y. and Smith, R. (2001), "Bounds Testing Approaches to the Analysis of Level Relationship", Journal of Applied Econometrics, 16(3), 289-326.

See Also

psst ecm narayan

Examples

kardl_model<-kardl(imf_example_data,
                   CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                   mode=c(1,2,3,0))
my_pssF<-pssf(kardl_model)
# Getting the results of the test.
my_pssF
# Getting details of the test.
my_summary<-summary(my_pssF)
my_summary

# Getting the critical values of the test.
my_summary$crit_vals




# Using magrittr :

library(magrittr)
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% pssf()

# Getting details of the test results using magrittr:
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% pssf() %>% summary()



PSS t Bound Test

Description

This function performs the Pesaran t Bound test

Usage

psst(kmodel, case = 3, signif_level = "auto")

Arguments

kmodel

A fitted KARDL model object of class 'kardl_lm' created using the kardl function.

case

Numeric or character. Specifies the case of the test to be used in the function. Acceptable values are 1, 2, 3, 4, 5, and "auto". If "auto" is chosen, the function determines the case automatically based on the model's characteristics. Invalid values will result in an error.

  • 1: No intercept and no trend

  • 2: Restricted intercept and no trend

  • 3: Unrestricted intercept and no trend

  • 4: Unrestricted intercept and restricted trend

  • 5: Unrestricted intercept and unrestricted trend

signif_level

Character or numeric. Specifies the significance level to be used in the function. Acceptable values are "auto", "0.10", "0.1", "0.05", "0.025", and "0.01". If a numeric value is provided, it will be converted to a character string. If "auto" is chosen, the function determines the significance level automatically. Invalid values will result in an error.

Details

This function performs the Pesaran, Shin, and Smith (PSS) t Bound test, which is used to detect the existence of a long-term relationship (cointegration) between variables in an autoregressive distributed lag (ARDL) model. The t Bound test specifically focuses on the significance of the coefficient of the lagged dependent variable, helping to assess whether the variable reverts to its long-term equilibrium after short-term deviations. The test provides critical values for both upper and lower bounds. If the t-statistic falls within the appropriate range, it confirms the presence of cointegration. This test is particularly useful when working with datasets containing both stationary and non-stationary variables.

Value

The function returns an object of class "htest" containing the following components:

Hypothesis testing

The PSS t Bound test evaluates the null hypothesis that the long-run coefficients of the model are equal to zero against the alternative hypothesis that at least one of them is non-zero. The test is conducted under different cases, depending on the model specification.

\Delta {y}_t = \psi + \varphi t + \eta _0 {y}_{t-1} + \sum_{i=1}^{k} { \eta _i {x}_{i,t-1} } + \sum_{j=1}^{p} { \gamma_{j} \Delta {y}_{t-j} }+ \sum_{i=1}^{k} {\sum_{j=0}^{q_i} { \beta_{ij} \Delta {x}_{i,t-j} } }+ e_t

\mathbf{H_{0}:} \eta_0 = 0

\mathbf{H_{1}:} \eta_{0} \neq 0

References

Pesaran, M. H., Shin, Y. and Smith, R. (2001), "Bounds Testing Approaches to the Analysis of Level Relationship", Journal of Applied Econometrics, 16(3), 289-326.

See Also

pssf ecm narayan

Examples


kardl_model<-kardl(imf_example_data,
                   CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                   mode=c(1,2,3,0))
my_test<-psst(kardl_model)
# Getting the results of the test.
my_test
# Getting details of the test.
my_summary<-summary(my_test)
my_summary

# Getting the critical values of the test.
my_summary$crit_vals




# Using magrittr :

library(magrittr)
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% psst()

# Getting details of the test results using magrittr:
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER)+deterministic(covid)+trend,
                           mode=c(1,2,3,0)) %>% psst() %>% summary()



Symmetry Test for non-linear KARDL Models

Description

The symmetry test is a statistical procedure used to assess whether the relationship between variables in a model is symmetric. It is particularly useful in econometric analysis, where it helps to identify whether the effects of increases and decreases in one variable on another differ. The test estimates a model that includes both positive and negative components of the variables and then performs a Wald test to determine whether the coefficients of these components differ significantly. If they do, the relationship is asymmetric: the impacts of increases and decreases in the variables are not the same.

This function returns results for both long-run and short-run variables in a KARDL model. Where applicable, it provides the Wald test statistics, p-values, degrees of freedom, sums of squares, and mean squares for each variable tested. Rejecting the null hypothesis of symmetry indicates that the effects of positive and negative changes in the variable are significantly different.

The non-linear model with one asymmetric variable is specified as follows:

\Delta{y_{t}} = \psi + \eta_{0}y_{t - 1} + \eta^{+}_{1} x^{+}_{t - 1}+ \eta^{-}_{1} x^{-}_{t - 1} + \sum_{j = 1}^{p}{\gamma_{j}\Delta y_{t - j}} + \sum_{j = 0}^{q}{\beta^{+}_{j}\Delta x^{+}_{t - j}} + \sum_{j = 0}^{m}{\beta^{-}_{j}\Delta x^{-}_{t - j}} + e_{t}

This function performs the symmetry test both for long-run and short-run variables in a kardl model. It uses the nlWaldtest function from the nlWaldTest package for long-run variables and the linearHypothesis function from the car package for short-run variables. The hypotheses for the long-run variables are:

H_{0}: -\frac{\eta^{+}_{1}}{\eta_{0}} = -\frac{\eta^{-}_{1}}{\eta_{0}}

H_{1}: -\frac{\eta^{+}_{1}}{\eta_{0}} \neq -\frac{\eta^{-}_{1}}{\eta_{0}}

The hypotheses for the short-run variables are:

H_{0}: \sum_{j = 0}^{q}{\beta^{+}_{j}} = \sum_{j = 0}^{m}{\beta^{-}_{j}}

H_{1}: \sum_{j = 0}^{q}{\beta^{+}_{j}} \neq \sum_{j = 0}^{m}{\beta^{-}_{j}}

Usage

symmetrytest(kmodel)

Arguments

kmodel

The kardl object.

Details

This function performs symmetry tests on non-linear KARDL models to assess whether the effects of positive and negative changes in independent variables are statistically different.

This function evaluates whether the inclusion of a particular variable in the model follows a linear relationship or exhibits a non-linear pattern. By analyzing the behavior of the variable, the function helps to identify if the relationship between the variable and the outcome of interest adheres to a straight-line assumption or if it deviates, indicating a non-linear interaction. This distinction is important in model specification, as it ensures that the variable is appropriately represented, which can enhance the model's accuracy and predictive performance.

Value

A list with class "kardl" containing the following components:

References

Shin, Y., Yu, B., & Greenwood-Nimmo, M. (2014). Modelling asymmetric cointegration and dynamic multipliers in a nonlinear ARDL framework. Festschrift in honor of Peter Schmidt: Econometric methods and applications, 281-314.

See Also

kardl, pssf, psst, ecm, narayan

Examples


kardl_model<-kardl(imf_example_data,
                   CPI~Lasym(PPI+ER)+Sas(ER)+deterministic(covid)+trend)
ast<- symmetrytest(kardl_model)
ast
# Detailed results of the test:
summary(ast)
# The null hypothesis of the test is that the model is symmetric, while the alternative
# hypothesis is that the model is asymmetric. The test statistic and p-value are provided
# in the output. If the p-value is less than a chosen significance level (e.g., 0.05),
# we reject the null hypothesis and conclude that there is evidence of asymmetry in the model.

# To get symmetry test results in long-run, you can use the following code:
ast$Lwald

# To get symmetry test results in short-run, you can use the following code:
ast$Swald

# To get the null and alternative hypotheses of the test in long-run,
# you can use the following code:

ast$Lhypotheses

# To get the null and alternative hypotheses of the test in short-run,
# you can use the following code:

ast$Shypotheses

# Using magrittr package
library(magrittr)
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER+PPI)+deterministic(covid)+trend,
                           mode=c(1,0,1,1,0)) %>% symmetrytest()

# To get the summary of the symmetry test results in one line, you can use the following code:
imf_example_data %>% kardl(CPI~ER+PPI+asym(ER+PPI)+deterministic(covid)+trend,
                           mode=c(1,0,1,1,0)) %>% symmetrytest() %>% summary()