Statistics, Optimization & Information Computing <p><em><strong>Statistics, Optimization and Information Computing</strong></em> (SOIC) is an international refereed journal dedicated to the latest advances in statistics, optimization and their applications in the information sciences. Topics of interest include, but are not limited to:</p> <p>Statistical theory and applications</p> <ul> <li class="show">Statistical computing, Simulation and Monte Carlo methods, Bootstrap, Resampling methods, Spatial statistics, Survival analysis, Nonparametric and semiparametric methods, Asymptotics, Bayesian inference and Bayesian optimization</li> <li class="show">Stochastic processes, Probability, Statistics and applications</li> <li class="show">Statistical methods and modeling in the life sciences, including the biomedical sciences, environmental sciences and agriculture</li> <li class="show">Decision theory, Time series analysis, High-dimensional multivariate integrals, Statistical analysis in markets, business, finance, insurance, economics and the social sciences, etc.</li> </ul> <p>Optimization methods and applications</p> <ul> <li class="show">Linear and nonlinear optimization</li> <li class="show">Stochastic optimization, Statistical optimization, Markov chains, etc.</li> <li class="show">Game theory, Network optimization and combinatorial optimization</li> <li class="show">Variational analysis, Convex optimization and nonsmooth optimization</li> <li class="show">Global optimization and semidefinite programming</li> <li class="show">Complementarity problems and variational inequalities</li> <li class="show"><span lang="EN-US">Optimal control: theory and applications</span></li> <li class="show">Operations research, Optimization and applications in management science and engineering</li> </ul> <p>Information computing and machine intelligence</p> <ul> <li class="show">Machine learning, Statistical learning, Deep learning</li> <li class="show">Artificial intelligence, Intelligence computation, Intelligent control and optimization</li> <li class="show">Data mining, Data analysis, Cluster computing, Classification</li> <li class="show">Pattern recognition, Computer vision</li> <li class="show">Compressive sensing and sparse reconstruction</li> <li class="show">Signal and image processing, Medical imaging and analysis, Inverse problems and imaging sciences</li> <li class="show">Genetic algorithms, Natural language processing, Expert systems, Robotics, Information retrieval and computing</li> <li class="show">Numerical analysis and algorithms with applications in computer science and engineering</li> </ul> en-US <span>Authors who publish with this journal agree to the following terms:</span><br /><br /><ol type="a"><li>Authors retain copyright and grant the journal right of first publication, with the work simultaneously licensed under a <a href="" target="_new">Creative Commons Attribution License</a> that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.</li><li>Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.</li><li>Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See <a href="" target="_new">The Effect of Open Access</a>).</li></ol> (David G.
Yu) (IAPress technical support) Sat, 03 Jun 2023 12:33:29 +0800 OJS 60 Bayesian and Classical Inference for Generalized Stress-Strength Parameter Under Generalized Logistic Distribution <p>In this paper, we study the generalized stress-strength model for the generalized logistic distribution. The maximum likelihood estimator of this quantity is obtained, and a confidence interval is then presented for it. Bayesian and bootstrap methods are also applied to the recommended model. A Markov Chain Monte Carlo (MCMC) simulation study for assessing the estimation methods is performed via the Metropolis-Hastings algorithm within each step of the Gibbs algorithm. An application to a real data set is presented.</p> Mohammad Mehdi Saber; Haitham Yousof Copyright (c) 2021 Statistics, Optimization & Information Computing Wed, 15 Dec 2021 00:00:00 +0800 A New Weighted Half-Logistic Distribution: Properties, Applications and Different Methods of Estimation <p>In this paper, we introduce a new two-parameter lifetime distribution based on the arctan function, called the weighted half-logistic (WHL) distribution. Theoretical properties of this model, including the quantile function, extreme values, linear combinations for the pdf and cdf, moments, conditional moments, the moment generating function and the mean deviation, are derived and studied in detail. The maximum likelihood estimates of the parameters are compared with various other methods of estimation in a simulation study. Finally, two real data sets show that this model provides a better fit than other well-known competing models.</p> Majid Hashempour, Morad Alizadeh Copyright (c) 2021 Statistics, Optimization & Information Computing Sat, 03 Jun 2023 12:01:47 +0800 On the Use of the Power Transformation Models to Improve the Temperature Time Series <p>The aim of this paper is to select an appropriate ARIMA model for the time series after transforming the original responses.
Box-Cox and Yeo-Johnson power transformation models were applied to the response variables of two time-series datasets of average temperatures, and appropriate ARIMA models were then diagnosed and built for each time series. The authors treat the results of the model fitting as a package in an attempt to choose the best model by diagnosing the effect of the data transformation on response normality, the significance of the estimated model parameters, forecastability and the behavior of the residuals. The authors conclude that the Yeo-Johnson model was more flexible in smoothing the data and contributed to obtaining a simple model with good forecastability.</p> Sameera Abdulsalam Othman, Haithem Taha Mohammed Ali Copyright (c) 2022 Statistics, Optimization & Information Computing Sat, 03 Jun 2023 12:03:04 +0800 Reliability Analysis of Exponentiated Exponential Distribution for Neoteric and Ranked Sampling Designs with Applications <p>The neoteric ranked set sampling (NRSS) scheme is an effective design compared to the usual ranked set sampling (RSS) scheme. Herein, we consider reliability estimation of the stress-strength (SS) model using the maximum likelihood procedure via NRSS and RSS designs. Assume that the stress Y and the strength X are exponentiated exponential random variables with the same scale parameter. Various sampling strategies are used to evaluate the reliability estimator. We obtain an estimate of R when the samples of the stress and strength random variables are chosen from the same sampling method, such as RSS or NRSS. Furthermore, we derive the estimator of R when X and Y are chosen from RSS and NRSS, respectively, and vice versa. A simulation study is conducted to assess and compare the accuracy of the estimates for all proposed schemes. Based on the study outcomes, we conclude that the reliability estimates of the stress-strength model via NRSS are more efficient than those via RSS.
An analysis of real data is presented to illustrate the usefulness of the proposed estimators.</p> Amal Hassan, Rasha Elshaarawy, Heba Nagy Copyright (c) 2023 Statistics, Optimization & Information Computing Sun, 08 Jan 2023 00:00:00 +0800 A New Two-Sided Class of Lifetime Distributions: Applications to Complete and Right Censored Data <p>In this article, we first define a new two-sided distribution, called the two-sided Kumaraswamy distribution, and then propose a generalized class of lifetime distributions by compounding the two-sided Kumaraswamy distribution with a baseline distribution. One of the advantages of this class of new distributions is that its members can be unimodal or bimodal. The general model is specified by taking the exponential distribution as the baseline. Some basic properties of the proposed distribution are derived. The model parameters are estimated by the maximum likelihood method. In addition, parametric and non-parametric bootstrap procedures are used to obtain point estimates and confidence intervals for the model parameters. A simulation study has been conducted to examine the bias and mean square error of the maximum likelihood estimators. We illustrate the performance of the proposed distribution on two real data sets (one complete and one right-censored), and both data sets show that the new distribution is more appropriate than the Weibull, gamma, weighted exponential, generalized two-sided exponential, generalized transmuted two-sided exponential and generalized exponential distributions.</p> Omid Kharazmi, Fatemeh Jamali Paghale, Ali Saadati Nik, Sanku Dey, Morad Alizadeh Copyright (c) 2023 Statistics, Optimization & Information Computing Sat, 14 Jan 2023 00:00:00 +0800 A New Weighted Topp-Leone Family of Distributions <p>Based on the T-X transform due to Alzaatreh et al.
(2013), we propose the new weighted Topp-Leone (NWTL-Π) family of continuous statistical distributions with two extra shape parameters. We study some of its basic mathematical properties, and then examine the uniform model as a member of the new class in more detail. Using a simulation study, we compare several methods of estimation. Finally, we analyze real lifetime and failure-time data sets for illustration.</p> Gorgees Shaheed Copyright (c) 2023 Statistics, Optimization & Information Computing Fri, 17 Mar 2023 00:00:00 +0800 Discrete Logistic Exponential Distribution with Application <p>In this paper, a new two-parameter discrete logistic exponential distribution is proposed based on the survival discretization approach. Some statistical properties are derived, and it is found that the proposed model can be used to discuss several kinds of failure rates, including unimodal, bathtub, and increasing-shaped. Moreover, it can be utilized effectively to model under- and over-dispersed data. The distribution parameters are estimated using the maximum likelihood technique. The behavior of the maximum likelihood estimators is assessed via a comprehensive simulation study. In the end, two real data sets are analyzed to show the usefulness of the new discrete distribution.</p> Afrah Al-Bossly, M. S Eliwa, Muhammad Ahsan-Ul-Haq, Mahmoud El-morshedy Copyright (c) 2023 Statistics, Optimization & Information Computing Sat, 18 Mar 2023 00:00:00 +0800 Semi-infinite Mathematical Programming Problems involving Generalized Convexity <p>In this paper, we consider semi-infinite mathematical programming problems with<br>equilibrium constraints (SIMPEC). By using the notion of convexificators, we establish sufficient optimality conditions for the SIMPEC. We formulate Wolfe and Mond-Weir type dual models for the SIMPEC under generalized convexity assumptions.
Moreover, weak and strong duality theorems are established to relate the SIMPEC and the two dual programs in the framework of convexificators.</p> Bhuwan C. Joshi Copyright (c) 2023 Statistics, Optimization & Information Computing Sat, 14 Jan 2023 00:00:00 +0800 D- And A- Optimal Orthogonally Blocked Mixture Component-Amount Designs via Projections <p>Mixture experiments are usually designed to study the effects on the response of changing the relative proportions of the mixture ingredients. This is usually achieved by keeping the total amount fixed, but in many practical applications, such as medicine or biology, not only the proportions of the mixture ingredients but also their total amount is of particular interest. Such experiments are called mixture-amount experiments. In such experiments, the usual constraint that the mixture proportions sum to unity is relaxed. The optimality of the design depends strictly on the nature of the underlying model. In this paper, we obtain D- and A-optimal orthogonally blocked mixture component-amount designs in two and three ingredients via projections, based on the reduced cubic canonical model presented by Husain and Sharma [7] and the additive quadratic mixture model proposed by Husain and Parveen [3], respectively.</p> Bushra Husain, Afrah Hafeez Copyright (c) 2023 Statistics, Optimization & Information Computing Wed, 18 Jan 2023 00:00:00 +0800 Grey Median Problem and Vertex Optimality <p>The median problem is a basic model in location theory and the transportation sciences. This problem deals with locating a facility on a network so as to minimize the sum of weighted distances between the facility and the vertices of the network. In this paper, the cases in which the vertex weights, the edge lengths, or both are grey numbers are considered. For all these cases, we show that the set of vertices of the network contains a solution of the median problem. This property is called vertex optimality.
The median problem with grey parameters and its properties are considered here for the first time.</p> Jafar Fathali Copyright (c) 2023 Statistics, Optimization & Information Computing Wed, 18 Jan 2023 00:00:00 +0800 Copy-Move Forgery Detection Using an Equilibrium Optimization Algorithm (CMFDEOA) <p>Image forgery detection is a new challenge. One type of image forgery is copy-move forgery, in which part of an image is copied and pasted at the most similar point elsewhere in the image. Given existing algorithms and processing software, identifying forged areas is difficult and has created challenges in various applications. The proposed method, based on the Equilibrium Optimization Algorithm (EOA), supports image forgery detection by locating forged areas. It comprises feature detection, image segmentation, and detection of forged areas using the EOA. In the first step, the image is converted to grayscale. Then, with the help of a discrete cosine transform (DCT) algorithm, it is transformed into the signal domain, and with the help of a discrete wavelet transform (DWT), suitable features are extracted. In the next step, the image is divided into blocks of equal size. The similarity search is then performed with the help of the equilibrium optimization algorithm and a suitable fitness function.
Copy-move forgery detection using the Equilibrium Optimization Algorithm (CMFDEOA) can find forged areas with an accuracy of about 86.21% for the IMD data set and about 83.98% for the MICC-F600 data set.</p> Ehsan Amiri, Ahmad Mosallanejad, Amir Sheikhahmadi Copyright (c) 2023 Statistics, Optimization & Information Computing Thu, 20 Apr 2023 00:00:00 +0800 A Note on a Strong Persistence of Stochastic Predator-Prey Model with Jumps <p>We study the non-autonomous stochastic predator-prey model with a modified version of the Leslie-Gower term and a Holling-type II functional response, driven by a system of stochastic differential equations with white noise and centered and non-centered Poisson noises. Sufficient conditions for strong persistence in the mean of the solution to the considered system are obtained.</p> Olga Borysenko, Oleksandr Borysenko Copyright (c) 2023 Statistics, Optimization & Information Computing Sat, 01 Apr 2023 00:00:00 +0800 Analysis of Dependent Variables Following Marshal-Olkin Bivariate Distributions in the Presence of Progressive Type II Censoring <p>In this paper, the likelihood function under progressive Type II censoring is generalized for the Marshal-Olkin bivariate class of distributions and applied to the bivariate Dagum distribution. Maximum likelihood estimation is considered for the unknown model parameters. Asymptotic and bootstrap confidence intervals for the unknown parameters are evaluated under progressive Type II censoring.
Bayesian estimation is also considered for both complete and progressive Type II censored samples; moreover, the Bayes estimators are obtained explicitly under the squared error loss function in both cases.</p> Hiba Muhammed Copyright (c) 2023 Statistics, Optimization & Information Computing Sat, 01 Apr 2023 00:00:00 +0800 Comparison of E-Bayesian Estimators in Burr XII Model Using E-PMSE Based on Record Values <p>In this paper, we consider the problem of E-Bayesian estimation and its expected posterior mean squared error (E-PMSE) in a Burr type XII model on the basis of record values. The Bayesian and E-Bayesian estimators are computed under different prior distributions for the hyperparameters. The E-PMSE of the E-Bayesian estimators is calculated in order to measure the estimation risk. The performances of the E-Bayesian estimators are compared using a Monte Carlo simulation. A real data set is analyzed to illustrate the estimation results.</p> Alla Alhamidah, Mehran Naghizadeh Qmi, Azadeh Kiapour Copyright (c) 2023 Statistics, Optimization & Information Computing Thu, 20 Apr 2023 00:00:00 +0800 On Testing the Adequacy of the Lindley Model and Power Study <p>The Lindley distribution may serve as a useful reliability model. Applications of this distribution are presented in the statistical literature. In this article, goodness-of-fit tests for the Lindley distribution based on the empirical distribution function (EDF) are considered. In order to compute the test statistics, we use the maximum likelihood estimate (MLE) suggested by Ghitany et al. (2008), which is a simple explicit estimator. Critical points of the proposed test statistics are obtained by Monte Carlo simulation. Power comparisons of the considered tests are carried out via simulations.
Finally, two illustrative examples are presented and analyzed.</p> Hadi Alizadeh Noughabi, Mohammad Shafaei Noughabi Copyright (c) 2023 Statistics, Optimization & Information Computing Thu, 20 Apr 2023 00:00:00 +0800 Prediction of Characteristics Using a Convolutional Neural Network Based on Experimental Data on the Structure and Composition of Metamaterials <p>This work proposes an algorithm for predicting the properties of metamaterials from their structure, the physical properties of their components, and their characteristics. In this context, the term "properties" means the result of irradiating the material with electromagnetic exposure of a certain frequency or spectral composition, in order to determine the transmittance/reflection coefficients of the metamaterial. The model is based on constructing the metamaterial as a 3D object, representing its physical properties as additional components of the object's vectors, and representing experimental data as polynomial coefficients or as points on a dependency chart. Despite the small amount of data, a sufficiently small error rate was obtained in both cases, and prediction results for experimental data are presented. The amount of experimental data can be increased via supplementary parameters that characterize the conditions under which the experimental data were obtained: polarization, angle of incidence, intensity of irradiation, etc.
The main issues may arise during the preparation of data for neural network training, owing to difficulties in converting 3D formats into the required data arrays and in taking into account all the relevant conditions, such as dielectric and magnetic permeabilities and specific conductivity.</p> Maxim Zozyuk, Dmitri Koroliouk, Pavel Krysenko, Alexei Yurikov, Yuriy Yakymenko Copyright (c) 2023 Statistics, Optimization & Information Computing Thu, 20 Apr 2023 00:00:00 +0800 On Past Extropy and Negative Cumulative Extropy Properties of Ranked Set Sampling and Maximum Ranked Set Sampling with Unequal Samples <p>Ranked set sampling is considered an alternative to simple random sampling, and maximum ranked set sampling is a very useful modification of ranked set sampling. In this paper we focus on the information content of ranked set sampling and maximum ranked set sampling with unequal samples in terms of the past extropy measure, and also consider the information content of negative cumulative extropy and its dynamic version based on maximum ranked set sampling and simple random sampling designs. We compare ranked set sampling data and maximum ranked set sampling data with simple random sampling and with each other. We also obtain a new discrimination information measure, for the past extropy measure, among simple random sampling data, ranked set sampling data and maximum ranked set sampling data.</p> Irshad M R, Maya R, Archana K, Tahmasebi S Copyright (c) 2023 Statistics, Optimization & Information Computing Fri, 21 Apr 2023 00:00:00 +0800 The Balakrishnan-Alpha-Beta-Skew-Laplace Distribution: Properties and Applications <p>In this paper, a new form of the alpha-beta-skew-Laplace distribution is proposed under the Balakrishnan [3] mechanism, and some of its related distributions are investigated. The moments, distributional properties and some extensions of the proposed distribution have also been studied.
Finally, the suitability and appropriateness of the proposed distribution are tested by conducting data-fitting experiments and comparing its Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) values with those of some other related distributions. The likelihood ratio test is used to discriminate between the nested models.</p> Sricharan Shah, Partha Jyoti Hazarika, Subrata Chakraborty, Morad Alizadeh Copyright (c) 2022 Statistics, Optimization & Information Computing Tue, 30 Aug 2022 00:00:00 +0800 A Primal-Dual Interior-Point Algorithm Based on a Kernel Function with a New Barrier Term <p>In this paper, we propose a path-following interior-point method (IPM) for solving linear optimization (LO) problems based on a new kernel function (KF). The latter differs from other KFs in having an exponential-hyperbolic barrier term of the hyperbolic type, recently developed by I. Touil and W. Chikouche \cite{filomat2021,acta2022}. The complexity analysis for large-update primal-dual IPMs based on this KF yields an $\mathcal{O}\left( \sqrt{n}\log^2n\log \frac{n}{\epsilon }\right)$ iteration bound, which improves on the classical iteration bound. For small-update methods, the proposed algorithm enjoys the favorable iteration bound $\mathcal{O}\left( \sqrt{n}\log \frac{n}{\epsilon }\right)$. We back up these results with some preliminary numerical tests, which show that our algorithm outperforms other algorithms with better theoretical convergence complexity. To our knowledge, this is the first feasible primal-dual interior-point algorithm based on an exponential-hyperbolic KF.</p> Safa Guerdouh, Wided Chikouche, Imene Touil Copyright (c) 2023 Statistics, Optimization & Information Computing Fri, 21 Apr 2023 00:00:00 +0800 Self-Scheduling of a Generation Company with Carbon Emission Trading <p>A carbon emission trading self-scheduling (CETSS) model was proposed.
The proposed model considered not only carbon emission allowance constraints but also carbon emission trading. A new method, based on piecewise linearisation and second-order cone linearisation, was presented for solving CETSS problems. The effectiveness and validity of the proposed model and method were illustrated on 10- to 100-unit systems over 24 hours.</p> Sidong Liu, Handong Cao, Xijian Wang Copyright (c) 2023 Statistics, Optimization & Information Computing Thu, 11 May 2023 00:00:00 +0800