Similar literature
20 similar records found.
1.
Several generalized logistic curve models and their applications   (Cited by 12: 0 self-citations, 12 by others)
This paper presents three generalizations of the logistic curve, namely the generalized logistic curve model, the logistic curve model with seasonal variation, and the composite logistic curve model, together with methods for estimating their parameters, and applies them to produce accurate forecasts of China's automobile output.
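For reference, a minimal sketch of the curve family involved, written in standard parametrizations; the paper's exact generalized, seasonal and composite forms are not reproduced in the abstract, so these are only illustrative.

```latex
% Basic logistic trend with saturation level K
x(t) = \frac{K}{1 + a e^{-bt}}, \qquad K, a, b > 0.

% A common generalization (Richards-type curve) with an extra shape parameter \nu
x(t) = \frac{K}{\bigl(1 + a e^{-bt}\bigr)^{1/\nu}}, \qquad \nu > 0.

% One way to introduce seasonal variation of period T into the growth term
x(t) = \frac{K}{1 + a \exp\bigl(-bt - c \sin(2\pi t / T)\bigr)}.
```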

2.
Following the idea of predictive linear programming based on the GM(1,1) model, a centre-approaching GM(1,1) model is established, which gives an improvement on grey predictive linear programming.
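As background, a minimal sketch of the plain GM(1,1) grey model that such variants start from; the centre-approaching construction itself is not described in the abstract, and the example series below is made up.

```python
import numpy as np

def gm11(x0, horizon=3):
    """Textbook GM(1,1) grey forecasting: fit on x0 and extend `horizon` steps ahead."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                          # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values
    B = np.column_stack([-z1, np.ones(n - 1)])  # design matrix of the grey differential equation
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])  # inverse AGO
    return x0_hat[:n], x0_hat[n:]

fit, forecast = gm11([2.87, 3.28, 3.34, 3.73, 3.87, 4.01])
```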

3.
Model-theoretic logic and theoretical computer science   (Cited by 2: 0 self-citations, 2 by others)
沈恩绍. 数学进展 (Advances in Mathematics), 1996, 25(3): 193-202
In recent years, the penetration and application of semantic ideas and methods from logic into many branches of theoretical computer science have attracted increasing attention. As the logical foundation of semantic methods, (first-order) model theory is the branch of mathematical logic that studies the connection between the syntactic constructions and semantic properties of (first-order) logic, while model-theoretic logic (also called generalized model theory) uses model-theoretic methods, within the framework of abstract logic, to study the similarities, differences and interrelations of various extended logical systems. Starting from the viewpoint of abstract logic, this paper introduces several concepts in model theory that are closely related to computer science (CS), together with their applications, in particular generalized finite model theory, which took shape and developed rapidly in the 1980s under the stimulus of CS and has produced outstanding results and major applications in database theory, computational complexity, and the theory of formal languages and automata.

4.
Data mining techniques can effectively identify potential bank customers and greatly improve a bank's competitiveness. This paper introduces three models commonly used in data mining, logistic regression, BP neural networks and decision trees, and constructs a new hybrid model that combines logistic regression with a BP neural network. These four models are then used to mine the factors that may influence whether bank customers subscribe to term deposits, giving four prediction models of term-deposit subscription based on logistic regression, the BP neural network, the logistic-regression/BP hybrid, and the decision tree, all analysed in R. The models are compared for effectiveness and stability using accuracy and the AUC of their ROC curves. The results show that the proposed hybrid of logistic regression and the BP neural network predicts best, with accuracies of 0.936 and 0.931 on the training and test sets and ROC AUC values of 0.998 and 0.987, respectively. This can greatly narrow the range of customers to whom the bank promotes term deposits, effectively identify potential customers, and substantially improve the bank's efficiency.
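A minimal sketch of one way to build such a hybrid scorer, averaging the predicted probabilities of a logistic regression and a small neural network and comparing by ROC AUC and accuracy. The data are synthetic, and the paper's R implementation and exact combination rule are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in data; the paper's bank-marketing features are not reproduced.
X, y = make_classification(n_samples=4000, n_features=16, weights=[0.88], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

lr = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)
bp = make_pipeline(StandardScaler(),
                   MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                 random_state=0)).fit(X_tr, y_tr)

# Hybrid score: average the two predicted subscription probabilities.
p_hybrid = 0.5 * (lr.predict_proba(X_te)[:, 1] + bp.predict_proba(X_te)[:, 1])

print("AUC:", roc_auc_score(y_te, p_hybrid))
print("accuracy:", accuracy_score(y_te, (p_hybrid >= 0.5).astype(int)))
```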

5.
Forecasting China's industrial added value with support vector machines   (Cited by 1: 0 self-citations, 1 by others)
Industrial added value is an important indicator of a country's level of industrial development, but because it is affected by many factors it is relatively difficult to forecast. This paper proposes a time-series forecasting approach that combines support vector machines with differential evolution (DE) to forecast China's industrial added value. Simulations show that this model achieves higher forecasting accuracy than a least-squares support vector machine with kernel principal component analysis (KPCA-LS-SVM) and ridge regression (RR).
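A sketch of the general idea, tuning support vector regression hyperparameters with differential evolution on a lagged-value embedding of a synthetic series; the paper's data and exact formulation are not reproduced, and the bounds and lag length are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(80, dtype=float)
y = 100 + 2.5 * t + 10 * np.sin(t / 6) + rng.normal(0, 3, t.size)   # synthetic series

# Lagged values as regressors (a simple time-series embedding).
lags = 4
X = np.column_stack([y[i:len(y) - lags + i] for i in range(lags)])
target = y[lags:]

def neg_cv_score(params):
    logC, logG, eps = params
    model = SVR(C=10.0 ** logC, gamma=10.0 ** logG, epsilon=eps)
    return -cross_val_score(model, X, target, cv=5,
                            scoring="neg_mean_squared_error").mean()

bounds = [(-1, 3), (-4, 0), (0.01, 1.0)]   # log10(C), log10(gamma), epsilon
result = differential_evolution(neg_cv_score, bounds, seed=0, maxiter=30, tol=1e-3)
best = SVR(C=10.0 ** result.x[0], gamma=10.0 ** result.x[1],
           epsilon=result.x[2]).fit(X, target)
```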

6.
Application of combination forecasting methods to atmospheric environment assessment   (Cited by 9: 0 self-citations, 9 by others)
李振亮. 组合预测方法在大气环境评价中的应用. 数理统计与管理 (Journal of Applied Statistics and Management), 1997, 16(4): 12-15. Using the theory and methods of combination forecasting, this paper builds a combined model for assessing atmospheric environmental quality. A case study shows that the combined model exploits the information provided by each individual assessment model and pools their strengths, giving a robust, optimized model for environmental quality assessment.

7.
An optimized GM(1,1) model based on the grey derivative and the prediction coefficient   (Cited by 1: 0 self-citations, 1 by others)
Since the GM(1,1) model is suited to nearly exponential sequences, a method is proposed that combines an optimized grey derivative with estimation of the prediction coefficient c by minimizing the sum of squared relative errors of the fit to the original sequence. This yields a new, computationally simpler optimized GM(1,1) model whose prediction formula x^(0)(k) = c·e^(-ak) takes a compact form. For strictly exponential sequences it is shown theoretically that the parameter a satisfies the whitening exponential-law coincidence property and the prediction coefficient c satisfies the whitening coefficient coincidence property.

8.
The macroscopic mechanical behaviour of heterogeneous composite materials often depends on the distribution and mechanical properties of their meso-scale constituents, but an explicit relationship is extremely difficult to establish. To address this challenge, taking concrete as the study object, a deep-learning strategy is proposed that efficiently and accurately obtains stress-strain curves from images of meso-scale models. A GoogLeNet model based on convolutional neural networks (CNNs) is used to identify and extract image information; to cope with the complexity of stress-strain curves, the data are preprocessed and a corresponding multi-task loss function is designed. The meso-scale model images in the dataset are generated with a Monte-Carlo random aggregate model, and numerical simulations are used to obtain the uniaxial compression stress-strain curve of each meso-scale model. Finally, the feasibility of the proposed method is evaluated by training and testing the neural network. The results show that the GoogLeNet model outperforms AlexNet and ResNet in both training efficiency and prediction accuracy, with good generalization ability and robustness.
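A simplified PyTorch sketch of the image-to-curve regression setup, using a small CNN as a stand-in for GoogLeNet and a plain multi-output MSE loss in place of the paper's multi-task loss; all tensors below are dummy data.

```python
import torch
import torch.nn as nn

class StressCurveCNN(nn.Module):
    """Small CNN regressor mapping a meso-structure image to stress values sampled
    at fixed strain points (a lightweight stand-in for the GoogLeNet used in the paper)."""
    def __init__(self, n_points=20):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16, 128),
                                  nn.ReLU(), nn.Linear(128, n_points))

    def forward(self, x):
        return self.head(self.features(x))

model = StressCurveCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.rand(8, 1, 64, 64)   # dummy meso-model images
curves = torch.rand(8, 20)          # dummy normalized stress ordinates
# A plain multi-output loss; the paper's multi-task loss (e.g. separate peak-stress
# and curve-shape terms) is not reproduced here.
loss = nn.functional.mse_loss(model(images), curves)
loss.backward()
opt.step()
```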

9.
Taking patients who underwent splenectomy for portal hypertension due to cirrhosis as the study population, the occurrence of postoperative portal vein thrombosis (PVT) was observed and recorded, and clinical measurements of P-selectin, thrombus precursor protein (TPP) and D-dimer (D2) were collected on postoperative days 1, 3, 5, 7 and 14. The relationship between each clinical indicator and PVT was analysed, corresponding statistical models were built, and receiver operating characteristic (ROC) curves were used to compare the value of the indicators, alone and in combination, for predicting PVT and to determine the best postoperative prediction time and threshold, so that PVT can be predicted accurately and early, laying a foundation for the clinical prevention and treatment of PVT in these patients.
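A sketch of the ROC-based evaluation described, computing a single-marker AUC with a Youden-index cut-off and a joint logistic model over all markers; the clinical data below are simulated, not the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
n = 200
pvt = rng.integers(0, 2, n)                     # simulated PVT outcome (1 = thrombosis)
# Simulated day-7 marker values; real P-selectin / TPP / D-dimer data are not reproduced.
markers = rng.normal(0, 1, (n, 3)) + pvt[:, None] * [0.8, 0.6, 1.0]

# Single-marker evaluation, with the Youden index picking a cut-off.
fpr, tpr, thr = roc_curve(pvt, markers[:, 2])
best_cut = thr[np.argmax(tpr - fpr)]
print("single-marker AUC:", roc_auc_score(pvt, markers[:, 2]), "cut-off:", best_cut)

# Joint prediction: logistic regression on all three markers, scored by AUC.
joint = LogisticRegression().fit(markers, pvt)
print("joint AUC:", roc_auc_score(pvt, joint.predict_proba(markers)[:, 1]))
```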

10.
姚平. 运筹与管理 (Operations Research and Management Science), 2009, 18(5): 153-157
Based on the realities of coal enterprises, an indicator system for their sustainable development is proposed. Using the theory and methods of the interval-judgment analytic hierarchy process (IAHP) combined with entropy theory, an IAHP-Entropy comprehensive evaluation model is built that integrates subjective and objective weighting. The method avoids the arbitrariness of purely subjective weighting while also overcoming the inability of purely objective weighting to reflect expert experience and decision-makers' preferences. An application of the model confirms that the method is sound and practically feasible.
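A sketch of the entropy-weighting side of such a scheme and one simple way to integrate it with subjective weights; the indicator matrix, the subjective weights and the multiplicative integration rule below are illustrative assumptions, not the paper's.

```python
import numpy as np

def entropy_weights(X):
    """Objective weights from the entropy method: columns = indicators, rows = alternatives."""
    P = X / X.sum(axis=0)                                   # column-wise proportions
    m = X.shape[0]
    e = -(P * np.log(P, where=P > 0, out=np.zeros_like(P))).sum(axis=0) / np.log(m)
    d = 1.0 - e                                             # degree of divergence per indicator
    return d / d.sum()

X = np.array([[0.62, 0.80, 0.35],                           # dummy normalized indicator matrix
              [0.55, 0.60, 0.50],
              [0.70, 0.75, 0.40],
              [0.58, 0.65, 0.45]])
w_subjective = np.array([0.5, 0.3, 0.2])                    # e.g. from IAHP expert judgments
w_objective = entropy_weights(X)
# One common way to integrate the two (multiplicative normalization);
# the paper's exact integration rule is not specified in the abstract.
w = w_subjective * w_objective
w /= w.sum()
scores = X @ w
```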

11.
In the consumer credit industry, assessment of default risk is critically important for the financial health of both the lender and the borrower. Methods for predicting risk for an applicant using credit bureau and application data, typically based on logistic regression or survival analysis, are universally employed by credit card companies. Because of the manner in which the predictive models are fit using large historical sets of existing customer data that extend over many years, default trends, anomalies, and other temporal phenomena that result from dynamic economic conditions are not brought to light. We introduce a modification of the proportional hazards survival model that includes a time-dependency mechanism for capturing temporal phenomena, and we develop a maximum likelihood algorithm for fitting the model. Using a very large, real data set, we demonstrate that incorporating the time dependency can provide more accurate risk scoring, as well as important insight into dynamic market effects that can inform and enhance related decision making.
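As an illustration of hazard models with time-varying information (not the authors' specific time-dependency mechanism, data or fitting algorithm), a sketch using lifelines' CoxTimeVaryingFitter on toy long-format account records.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Toy long-format data: one row per account per observation interval, with a
# time-varying macro covariate; real bureau/application variables are not reproduced.
df = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 3],
    "start": [0, 6, 12, 0, 6, 0],
    "stop":  [6, 12, 18, 6, 9, 7],
    "event": [0, 0, 1, 0, 1, 0],          # 1 = default observed in the interval
    "score": [620, 615, 600, 700, 690, 660],
    "unemployment": [4.1, 4.5, 5.2, 4.1, 4.5, 4.1],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()
```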

12.
Logistic regression is a simple and efficient supervised learning algorithm for estimating the probability of an outcome or class variable. In spite of its simplicity, logistic regression has shown very good performance in a range of fields and is widely accepted because its results are easy to interpret. Fitting the logistic regression model usually involves using the principle of maximum likelihood. The Newton–Raphson algorithm is the most common numerical approach for obtaining the coefficients maximizing the likelihood of the data. This work presents a novel approach for fitting the logistic regression model based on estimation of distribution algorithms (EDAs), a tool for evolutionary computation. EDAs are suitable not only for maximizing the likelihood, but also for maximizing the area under the receiver operating characteristic curve (AUC). Thus, we tackle the logistic regression problem from a double perspective: likelihood-based to calibrate the model and AUC-based to discriminate between the different classes. Under these two objectives of calibration and discrimination, the Pareto front can be obtained in our EDA framework. These fronts are compared with those yielded by a multiobjective EDA recently introduced in the literature.
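A sketch of the EDA idea in its simplest continuous (Gaussian, UMDA-style) form, evolving logistic-regression coefficients to maximize AUC on synthetic data; the paper's specific EDA and its bi-objective Pareto treatment are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=600, n_features=5, random_state=0)
X = np.column_stack([np.ones(len(X)), X])          # intercept column

def auc_of(beta):
    return roc_auc_score(y, 1.0 / (1.0 + np.exp(-X @ beta)))

# Gaussian UMDA-style EDA: sample coefficient vectors, keep the best by AUC,
# refit the independent-Gaussian model, repeat.
rng = np.random.default_rng(0)
mu, sigma = np.zeros(X.shape[1]), np.full(X.shape[1], 2.0)
for _ in range(40):
    pop = rng.normal(mu, sigma, size=(200, X.shape[1]))
    elite = pop[np.argsort([-auc_of(b) for b in pop])[:40]]
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-3
print("best AUC:", auc_of(mu))
```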

13.
Based on Chinese mortality data, the age-period-cohort (APC) model is extended, and the extended mortality model (the EAPC model) is compared with the APC and Lee-Carter (LC) models. Comparing goodness of fit and forecasting performance and testing stability, it is found that the EAPC model extended from the APC model is better suited to fitting and forecasting Chinese mortality, providing more feasible options for the use of mortality models in China.
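For orientation, the standard forms of the LC and APC mortality models referred to above, in common notation; the EAPC extension itself is not specified in the abstract, so it is not shown.

```latex
% Lee-Carter (LC) model for the central death rate m_{x,t} at age x in year t
\ln m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t}

% Age-period-cohort (APC) model, adding a cohort term indexed by year of birth t - x
\ln m_{x,t} = a_x + k_t + \gamma_{t-x} + \varepsilon_{x,t}
```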

14.
This paper presents an attempt to make reliable projections of the stock of agricultural tractors in Spain. The approach followed is to fit a logistic trend to the historical data for the period 1951-1976. A set of possible trend curves is presented and their properties discussed; reasons for choosing the logistic trend are given. Preliminary estimates for the parameters of the logistic are obtained by means of the relatively unsophisticated three-point method. Alternative heteroscedasticity assumptions about the data are explored with the help of a non-linear regression. Final estimates are obtained by means of a non-linear optimization algorithm. The influence of economic variables and government policies is traced in the residuals.
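A sketch of the final fitting step: non-linear least squares on a logistic trend, starting from rough values of the kind a three-point method would supply. The series below is synthetic, standing in for the Spanish tractor stock.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, a, b):
    """Logistic trend: saturation level K, scale a, growth rate b."""
    return K / (1.0 + a * np.exp(-b * t))

years = np.arange(1951, 1977)
t = years - years[0]
# Synthetic stock series (thousands of units), not the historical Spanish data.
stock = 900.0 / (1.0 + 60.0 * np.exp(-0.22 * t)) + np.random.default_rng(0).normal(0, 8, t.size)

# Rough starting values (e.g. from a three-point method), then non-linear least squares.
p0 = (stock.max() * 1.2, 50.0, 0.2)
params, cov = curve_fit(logistic, t, stock, p0=p0, maxfev=10000)
projection_1985 = logistic(1985 - years[0], *params)
```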

15.
The “leapfrog” hybrid Monte Carlo algorithm is a simple and effective MCMC method for fitting Bayesian generalized linear models with canonical link. The algorithm leads to large trajectories over the posterior and a rapidly mixing Markov chain, and has superior performance over conventional methods in difficult problems such as logistic regression with quasi-complete separation. The method offers a very attractive solution to this common problem, providing a way to identify datasets that are quasi-completely separated and the covariates that are at the root of the problem. It is also quite successful in fitting generalized linear models in which the link function is extended to include a feedforward neural network. With a large number of hidden units, however, or when the dataset becomes large, the computations required to calculate the gradient along each trajectory can become very demanding. In this case it is best to mix the algorithm with multivariate random walk Metropolis-Hastings, which entails very little additional programming work.
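A minimal sketch of leapfrog hybrid (Hamiltonian) Monte Carlo for a Bayesian logistic regression with an independent Gaussian prior; the step size, trajectory length, prior variance and data are illustrative choices, not the article's settings.

```python
import numpy as np
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=300, n_features=4, random_state=0)
X = np.column_stack([np.ones(len(X)), X])
tau2 = 10.0                                            # prior variance for each coefficient

def log_post(b):
    eta = X @ b
    return y @ eta - np.logaddexp(0.0, eta).sum() - 0.5 * b @ b / tau2

def grad(b):
    p = 1.0 / (1.0 + np.exp(-(X @ b)))
    return X.T @ (y - p) - b / tau2

def hmc(n_iter=2000, eps=0.02, L=30, seed=0):
    """Leapfrog hybrid Monte Carlo for the Bayesian logistic model above."""
    rng = np.random.default_rng(seed)
    b = np.zeros(X.shape[1])
    samples = []
    for _ in range(n_iter):
        p0 = rng.normal(size=b.size)
        b_new, p = b.copy(), p0.copy()
        p += 0.5 * eps * grad(b_new)                   # half step for the momentum
        for _ in range(L):
            b_new += eps * p                           # full step for the position
            p += eps * grad(b_new)                     # full step for the momentum
        p -= 0.5 * eps * grad(b_new)                   # undo the extra half step at the end
        log_accept = (log_post(b_new) - 0.5 * p @ p) - (log_post(b) - 0.5 * p0 @ p0)
        if np.log(rng.uniform()) < log_accept:
            b = b_new
        samples.append(b.copy())
    return np.array(samples)

posterior = hmc()
print(posterior[500:].mean(axis=0))                    # posterior means after burn-in
```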

16.
17.
This paper proposes a novel algorithm to reconstruct an unknown distribution by fitting its first four moments to a proper parametrized probability distribution (PPD) model. First, a PPD system containing three previously developed PPD models is suggested to approximate the unknown distribution, rather than empirically adopting a single distribution model. Then, a two-step algorithm based on the moment-matching criterion and the maximum entropy principle is proposed to specify the appropriate (final) PPD model in the system for the distribution. The proposed algorithm is first verified by approximating several commonly used analytical distributions, along with a real dataset, where existing measures are also employed to demonstrate the effectiveness of the two-step algorithm. Further, the effectiveness of the algorithm is demonstrated through an application to three typical moment-based reliability problems. The proposed algorithm is found to be a robust tool for selecting an appropriate PPD model in the system to recover an unknown distribution by fitting its first four moments.
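A much-simplified sketch of moment-based model selection: fit several candidate parametric distributions and keep the one whose implied first four moments best match the sample's. The candidate set and mismatch measure are illustrative; the paper's PPD system and two-step maximum-entropy algorithm are not reproduced.

```python
import numpy as np
from scipy import stats

data = stats.lognorm.rvs(0.6, size=2000, random_state=0)        # stand-in dataset

sample = np.array([data.mean(), data.std(ddof=1),
                   stats.skew(data), stats.kurtosis(data)])      # mean, std, skew, excess kurtosis

candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma, "weibull_min": stats.weibull_min}
best, best_err = None, np.inf
for name, dist in candidates.items():
    params = dist.fit(data)                                      # fit each parametrized model
    m, v, s, k = dist.stats(*params, moments="mvsk")             # its implied moments
    model = np.array([m, np.sqrt(v), s, k])
    err = np.sum(((model - sample) / (np.abs(sample) + 1e-12)) ** 2)
    if err < best_err:
        best, best_err = (name, params), err
print("selected model:", best[0], "relative moment mismatch:", best_err)
```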

18.
In this paper an implementation is discussed of a modified CANDECOMP algorithm for fitting Lazarsfeld's latent class model. The CANDECOMP algorithm is modified such that the resulting parameter estimates are non-negative and ‘best asymptotically normal’. In order to achieve this, the modified CANDECOMP algorithm minimizes a weighted least squares function instead of an unweighted least squares function as the traditional CANDECOMP algorithm does. To evaluate the new procedure, the modified CANDECOMP procedure with different weighting schemes is compared on five published data sets with the widely-used iterative proportional fitting procedure for obtaining maximum likelihood estimates of the parameters in the latent class model. It is found that, with appropriate weights, the modified CANDECOMP algorithm yields solutions that are nearly identical with those obtained by means of the maximum likelihood procedure. While the modified CANDECOMP algorithm tends to be computationally more intensive than the maximum likelihood method, it is very flexible in that it easily allows one to try out different weighting schemes.

19.
A regression forecasting model with recursive kernel regression estimation   (Cited by 1: 0 self-citations, 1 by others)
This paper first constructs a regression model incorporating recursive kernel regression estimation and then applies it to the consumption expenditure of urban residents in Hunan Province, improving the goodness of fit and reducing forecasting error.
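For reference, a sketch of the plain (non-recursive) Nadaraya-Watson kernel regression estimator that such recursive variants build on; the recursive form updates the estimate observation by observation with a decreasing bandwidth, which is not shown here.

```python
import numpy as np

def nw_kernel_regression(x_train, y_train, x_query, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel and bandwidth h."""
    d = (x_query[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d ** 2)                      # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)

x = np.linspace(0, 10, 80)
y = np.sin(x) + np.random.default_rng(0).normal(0, 0.2, x.size)   # synthetic data
x_new = np.linspace(0, 10, 200)
y_hat = nw_kernel_regression(x, y, x_new, h=0.5)
```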

20.
Time-series analysis of monthly precipitation in the Nantong region   (Cited by 2: 1 self-citation, 1 by others)
Based on monthly precipitation data for the Nantong region from 1989 to 2005, and after statistical tests of stationarity and pure randomness combined with spectral analysis, a sparse-coefficient seasonal ARIMA time-series model of monthly precipitation is built for the region and checked by fitting and forecasting. The study shows that using several models jointly gives more accurate fitting and forecasting than any single model.
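A sketch of fitting a seasonal ARIMA model to a synthetic monthly series with statsmodels; the model order is an illustrative choice, and a sparse-coefficient model would additionally constrain insignificant lag coefficients to zero when refitting.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
idx = pd.date_range("1989-01", periods=17 * 12, freq="MS")
# Synthetic monthly precipitation with an annual cycle, not the Nantong data.
rain = 80 + 60 * np.sin(2 * np.pi * (idx.month - 6) / 12) + rng.gamma(2.0, 15.0, len(idx))
y = pd.Series(rain, index=idx)

# Seasonal ARIMA with a 12-month seasonal period; orders here are illustrative.
model = sm.tsa.SARIMAX(y, order=(2, 0, 1), seasonal_order=(1, 1, 1, 12),
                       enforce_stationarity=False)
fit = model.fit(disp=False)
forecast = fit.forecast(steps=12)       # one-year-ahead forecast
```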
