1.
We report on the development of a fiber-based laser transmitter designed for active remote sensing spectroscopy. The transmitter uses a master oscillator power amplifier (MOPA) configuration with a distributed feedback diode-laser master oscillator and an erbium-doped fiber amplifier. The output from the MOPA is frequency-doubled with a periodically poled potassium titanium oxide phosphate crystal. With 35 W of single-frequency peak optical pump power, 8 W of frequency-doubled peak power was achieved. The utility of this single-frequency, wavelength-tunable, power-scalable laser was then demonstrated in a spectroscopic measurement of the diatomic oxygen A band.
2.
Stereology versus planimetry to estimate the volume of malignant liver lesions on MR imaging
Mazonakis M Damilakis J Mantatzis M Prassopoulos P Maris T Varveris H Gourtsoyiannis N 《Magnetic resonance imaging》2004,22(7):1011-1016
Liver tumor volume measurements are clinically useful in patients undergoing cancer treatment. The techniques of planimetry and stereology were applied for this purpose on magnetic resonance (MR) imaging. Fifty-eight malignant liver lesions were depicted on MR images in 20 consecutive patients. The volume of all lesions was estimated using a stereology technique based on point counting. Stereological tumor volume estimations were compared with those determined by manual planimetry. The repeatability of both techniques was assessed. Tumor volumes estimated by the two techniques were highly correlated (r = 0.98, p < 0.0001). The 95% limits of agreement showed that the stereological volume estimations may differ from the planimetric assessments by less than 23%. Both techniques presented comparable intra- and interobserver variability. Planimetry was 1.5 times faster than stereology. Both volumetric techniques may provide reliable and reproducible liver tumor volume estimations; planimetry may be the method of choice because of its superior speed.
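The point-counting estimate behind stereology is straightforward to sketch. A minimal illustration in Python (the grid spacing, slice thickness, and binary-mask input format here are hypothetical assumptions for illustration, not the study's actual protocol):

```python
import numpy as np

def stereology_volume(masks, grid_mm=5.0, slice_mm=8.0):
    """Estimate volume (mm^3) by counting regular grid points that fall
    on the lesion:  V = slice_thickness * area_per_point * point_count.
    Assumes each mask is a binary 2-D array at 1 mm per pixel."""
    step = int(grid_mm)
    hits = 0
    for mask in masks:  # one binary slice mask per MR section
        # sample the mask on a regular grid every `grid_mm` millimeters
        hits += int(mask[::step, ::step].sum())
    return slice_mm * (grid_mm ** 2) * hits
```

For a synthetic 50 mm × 50 mm lesion spanning three 8 mm slices, the point-count estimate recovers the exact volume, since the lesion boundary aligns with the grid; on real, irregular lesions the grid sampling introduces the variability the study quantifies.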
3.
Bello D Einhorn A Kaushal R Kenchaiah S Raney A Fieno D Narula J Goldberger J Shivkumar K Subacius H Kadish A 《Magnetic resonance imaging》2011,29(1):50-56
Background
Cardiac magnetic resonance imaging (CMR) can accurately determine infarct size. Prior studies using indirect methods to assess infarct size have shown that patients with larger myocardial infarctions have a worse prognosis than those with smaller myocardial infarctions.
Objectives
This study assessed the prognostic significance of infarct size determined by CMR.
Methods
Cine and contrast CMR were performed in 100 patients with coronary artery disease (CAD) undergoing routine cardiac evaluation. Infarct size was determined by planimetry. We used Cox proportional hazards regression analyses (stepwise forward selection approach) to evaluate the risk of all-cause death associated with traditional cardiovascular risk factors, symptoms of heart failure, medication use, left ventricular ejection fraction, left ventricular mass, angiographic severity of CAD, and extent of infarct size determined by CMR.
Results
Ninety-one patients had evidence of myocardial infarction by CMR. Mean follow-up was 4.8±1.6 years after CMR, during which time 30 patients died. The significant multivariable predictors of all-cause mortality were extent of myocardial infarction by CMR, extent of left ventricular systolic dysfunction, symptoms of heart failure, and diabetes mellitus (P<.05). Infarct size greater than or equal to 24% of left ventricular mass and left ventricular ejection fraction less than or equal to 30% were the optimal cut-off points for the prediction of death, with bivariate adjusted hazard ratios of 2.11 (95% confidence interval 1.02-4.38) and 4.06 (95% confidence interval 1.73-9.54), respectively.
Conclusions
The extent of myocardial infarction determined by CMR is an independent predictor of death in patients with CAD.
4.
Haris K Efstratiadis SN Maglaveras N Pappas C Gourassas J Louridas G 《IEEE transactions on medical imaging》1999,18(10):1003-1015
5.
Implementation of new and innovative energy technologies is a key means toward a sustainable energy system. Governments currently have to decide which technologies, from an increasingly diverse mix, warrant support, including funding and other incentives for private-sector efforts. However, appraising energy technologies in terms of their sustainability is a complex task, given the many uncertainties and implications that must be addressed to obtain realistic and transparent results. In this context, the main aim of this paper is to present a direct and flexible multi-criteria decision-making approach, using linguistic variables, to assist policy makers in formulating sustainable technological energy priorities. Furthermore, its software realization is applied to a number of technologies in the context of the Greek Technology Foresight Programme, and the results are presented and discussed.
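The core of a linguistic-variable multi-criteria approach is mapping verbal ratings to a numeric scale and aggregating them under criterion weights. A minimal sketch (the five-term scale, the criteria, and the weights below are hypothetical assumptions, not the paper's actual model):

```python
# Hypothetical five-term linguistic scale; the paper's scale is not given here.
SCALE = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}

def rank_technologies(ratings, weights):
    """Rank technologies by weighted sum of linguistic ratings.

    ratings: {technology: {criterion: linguistic term}}
    weights: {criterion: numeric weight}
    Returns technology names, best first."""
    scores = {
        tech: sum(weights[c] * SCALE[term] for c, term in crits.items())
        for tech, crits in ratings.items()
    }
    return sorted(scores, key=scores.get, reverse=True)
```

Real approaches of this kind typically keep the aggregation fuzzy (or at least interval-valued) until the final ranking step; the crisp scoring above is only the simplest instance of the idea.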
6.
Haris Aziz 《Operations Research Letters》2013,41(5):499-502
Bankruptcy problems are a fundamental class of fair division problems in microeconomics. Among the various solution concepts proposed for the problem, the random arrival rule is one of the most prominent. In this paper, we conduct a computational analysis of the rule. It is shown that computing the allocation returned by the rule is #P-complete. The general complexity result is complemented by a pseudo-polynomial-time dynamic programming algorithm for the random arrival rule.
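The random arrival rule itself is easy to state: each claimant's payoff is min(claim, remaining estate), averaged over all arrival orders. A brute-force sketch makes the definition concrete (exponential in the number of claimants, which is exactly what the paper's pseudo-polynomial dynamic program avoids):

```python
from itertools import permutations

def random_arrival_rule(claims, estate):
    """Average each claimant's payoff min(claim, remaining estate)
    over all n! arrival orders. Brute force; only viable for small n."""
    n = len(claims)
    totals = [0.0] * n
    orders = list(permutations(range(n)))
    for order in orders:
        remaining = estate
        for i in order:
            pay = min(claims[i], remaining)
            totals[i] += pay
            remaining -= pay
    return [t / len(orders) for t in totals]
```

For claims (50, 100) and an estate of 100, the two orders pay (50, 50) and (0, 100), so the rule awards 25 and 75. The allocation always exhausts the estate when total claims exceed it, which gives a quick sanity check on any faster implementation.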
7.
Qureshi Rehan Mehboob Syed Haris Aamir Muhammad 《Wireless Personal Communications》2021,116(2):1379-1406
Wireless Personal Communications - Medical Body Area Networks or MBANs are gaining popularity in healthcare circles because of the convenience they provide to patients and caregivers and assist in...
8.
Hamed Asadi Haris Volos Michael M. Marefat Tamal Bose 《Analog Integrated Circuits and Signal Processing》2017,91(2):173-185
A cognitive radio engine (CE) is an intelligent agent that observes the radio environment and chooses the communication settings that best meet the application's goal. In this process, providing reliable performance is one of the major tasks in designing CEs for wireless communication systems. The main purpose of this work is to provide predictable performance and to control the cost of intelligent algorithms, based on the CE's experience and on complexity analysis, respectively. In this work, we extend our meta-CE design to control the cost of computation and to provide more reliable performance in meeting the minimum requirements of radio applications in different scenarios. To achieve this, we use a robust training algorithm (RoTA) at two different levels alongside the individual CE algorithms. The RoTA enables the radio to guarantee a minimum output performance based on the learning stage. RoTA uses a confidence-interval approximation for the standard normal distribution to calculate the lower and upper bounds of the CE's expected performance and thereby analyze the reliability of decisions. Moreover, in the case of non-stationary environments, RoTA incorporates a forgetfulness factor to provide minimum performance guarantees. The second level of RoTA operates at the meta-level to control the computational complexity of the intelligent algorithms at all levels, with respect to the obtained performance and complexity analysis.
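The confidence-interval step can be sketched as a standard normal approximation on observed performance samples (the z-value and sample handling below are generic textbook choices, not RoTA's exact procedure):

```python
import math

def performance_bounds(samples, z=1.96):
    """Normal-approximation confidence interval on expected performance.

    Returns (lower, upper) bounds on the mean of the observed samples;
    z=1.96 gives an approximate 95% interval."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half = z * math.sqrt(var / n)
    return mean - half, mean + half
```

A decision whose lower bound already clears the application's minimum requirement can be trusted; otherwise the engine needs more observations (or, in a non-stationary environment, down-weighted old ones) before committing.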
9.
Dinesh Datla Haris I. Volos S. M. Hasan Jeffrey H. Reed Tamal Bose 《Ad hoc Networks》2012,10(5):845-857
Individual cognitive radio nodes in an ad-hoc cognitive radio network (CRN) have to perform complex data processing operations for several purposes, such as situational awareness and cognitive engine (CE) decision making. From an implementation point of view, each cognitive radio (CR) may not have the computational and power resources to perform these tasks by itself. In this paper, wireless distributed computing (WDC) is presented as a technology that enables multiple resource-constrained nodes to collaborate in computing complex tasks in a distributed manner. This approach has several benefits over the traditional approach of local computing, such as reduced energy and power consumption, reduced burden on the resources of individual nodes, and improved robustness. However, these benefits can be negated by the communication overhead involved in WDC. This paper demonstrates the application of WDC to CRNs with the help of an example CE processing task. In addition, the paper analyzes the impact of the wireless environment on WDC scalability in homogeneous and heterogeneous environments. The paper also proposes a workload allocation scheme that utilizes a combination of stochastic optimization and decision-tree search approaches. The results show limitations in the scalability of WDC networks, mainly due to the communication overhead involved in sharing raw data pertaining to delegated computational tasks.
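The scalability limit described above can be illustrated with a toy cost model in which each participating node adds a fixed data-sharing overhead (the linear overhead term is an assumption for illustration, not the paper's model):

```python
def wdc_speedup(work, nodes, comm_per_node):
    """Speedup of spreading `work` over `nodes` collaborating radios when
    sharing raw data costs `comm_per_node` time units per node (toy model)."""
    distributed_time = work / nodes + comm_per_node * nodes
    return work / distributed_time
```

With work=100 and comm_per_node=2, the speedup rises up to about 7 nodes and then falls: past that point the communication term grows faster than the computation term shrinks, mirroring the scalability limitation the paper reports.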
10.
Dinesh Datla Haris I. Volos S. M. Hasan Jeffrey H. Reed Tamal Bose 《Analog Integrated Circuits and Signal Processing》2011,69(2-3):341-353
Wireless distributed computing (WDC) is an enabling technology that allows radio nodes to cooperate in processing complex computational tasks of an application in a distributed manner. WDC research is driven by the fact that mobile portable computing devices have limitations in executing complex mobile applications, mainly attributable to their limited resources and functionality. This article focuses on resource allocation in WDC networks, specifically on scheduling and task allocation. In WDC, it is important to schedule communications between the nodes in addition to allocating computational tasks to nodes. Communication scheduling and heterogeneity in the operating environment make the WDC resource allocation problem challenging. This article presents a task allocation and scheduling algorithm that heuristically optimizes both energy consumption and makespan. The proposed algorithm uses a comprehensive model of the energy consumed in executing tasks and in communication between tasks assigned to different radio nodes. The algorithm is tested for three objectives: minimization of makespan, minimization of energy consumption, and minimization of both makespan and energy consumption.
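A heuristic of this flavor can be sketched as a greedy assignment that scores each candidate node by a weighted sum of the resulting makespan and added energy (the cost model, node parameters, and weighting below are illustrative assumptions, not the article's actual algorithm):

```python
def allocate(tasks, nodes, alpha=0.5):
    """Greedy task allocation trading off makespan against energy.

    tasks: list of computational costs
    nodes: {name: (speed, power)} -- hypothetical node model
    alpha: weight on makespan vs. energy (alpha=1 ignores energy)
    Returns (assignment, makespan, total_energy)."""
    finish = {n: 0.0 for n in nodes}   # per-node finish time
    energy = 0.0
    assign = {}
    # place largest tasks first, a common list-scheduling heuristic
    for t_id, cost in sorted(enumerate(tasks), key=lambda x: -x[1]):
        best = None
        for n, (speed, power) in nodes.items():
            dt = cost / speed
            new_makespan = max(max(finish.values()), finish[n] + dt)
            score = alpha * new_makespan + (1 - alpha) * (energy + power * dt)
            if best is None or score < best[0]:
                best = (score, n, dt)
        _, n, dt = best
        finish[n] += dt
        energy += nodes[n][1] * dt
        assign[t_id] = n
    return assign, max(finish.values()), energy
```

Setting alpha toward 1 or 0 recovers the single-objective variants (makespan-only or energy-only) that the article evaluates as its first two test objectives.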