Similar Documents
 20 similar documents retrieved; search time: 531 ms
1.
Spectral CT divides the broad X-ray spectrum into narrow energy channels, which markedly reduces the photon count in each channel and amplifies the influence of noise; reconstructing high-quality images from noisy projections is therefore a research focus in spectral CT. Conventional total variation (TV) regularization tends to produce blocky artifacts in the reconstructed images, whereas the total generalized variation (TGV) algorithm can approximate functions of arbitrary order. By combining TGV with the idea of the non-local means algorithm, and exploiting the correlation among images reconstructed in different energy channels, a high-quality full-spectrum reconstruction is used as a prior image to guide spectral CT reconstruction, yielding a non-local TGV reconstruction algorithm based on prior image constrained compressed sensing (PICCS). Experimental results show that the proposed algorithm suppresses noise while effectively restoring image details and edge information, and converges quickly.

2.
Spectral reduction was originally formulated entirely in the wavenumber domain as a coarse-grained wavenumber convolution in which bins of modes interact with enhanced coupling coefficients. A Liouville theorem leads to inviscid equipartition solutions when each bin contains the same number of modes. A pseudospectral implementation of spectral reduction which enjoys the efficiency of the fast Fourier transform is described. The model compares well with full pseudospectral simulations of the two-dimensional forced-dissipative energy and enstrophy cascades.

3.
Three-dimensional orthogonal bin packing is a problem that is NP-hard in the strong sense, in which a set of boxes must be orthogonally packed into the minimum number of three-dimensional bins. We present a two-level tabu search for this problem. The first level aims to reduce the number of bins; the second optimizes the packing of the bins. This latter procedure is based on the Interval Graph representation of the packing, proposed by Fekete and Schepers, which reduces the size of the search space. We also introduce a general method to increase the size of the associated neighborhoods, and thus the quality of the search, without increasing the overall complexity of the algorithm. Extensive computational results on benchmark problem instances show the effectiveness of the proposed approach, obtaining better results compared to the existing ones.

4.
In this paper we consider a class of bin selection and packing problems (BPP) in which potential bins are of various types, have two resource constraints, and the resource requirement for each object differs for each bin type. The problem is to select bins and assign the objects to bins so as to minimize the sum of bin costs while meeting the two resource constraints. This problem represents an extension of the classical two-dimensional BPP in which bins are homogeneous. Typical applications of this research include computer storage device selection with file assignment, robot selection with work station assignment, and computer processor selection with task assignment. Three solution algorithms have been developed and tested: a simple greedy heuristic, a method based on simulated annealing (SA), and an exact algorithm based on Column Generation with Branch and Bound (CG). An LP-based method for generating tight lower bounds was also developed (LB). Several hundred test problems based on computer storage device selection and file assignment were generated and solved. The heuristic solved problems of up to 100 objects in less than a second; the average solution value was within about 3% of the optimum. SA improved solutions to an average gap of less than 1%, but with a significant increase in computing time. LB produced average lower bounds within 3% of optimum within a few seconds. CG is practical for small to moderately sized problems — possibly as many as 50 objects.
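
As a rough illustration of the greedy idea mentioned in this abstract, the following is a minimal first-fit-style sketch for bins with two resource constraints and type-dependent requirements; the names (BinType, OpenBin, greedy_assign) and the rule of opening the cheapest feasible bin type are illustrative assumptions, not the authors' algorithm.

    from dataclasses import dataclass, field

    @dataclass
    class BinType:
        cost: float
        cap1: float            # capacity in resource 1
        cap2: float            # capacity in resource 2

    @dataclass
    class OpenBin:
        btype: int
        used1: float = 0.0
        used2: float = 0.0
        items: list = field(default_factory=list)

    def greedy_assign(bin_types, req):
        """req[j][t] = (r1, r2): requirements of object j if placed in a bin of type t."""
        bins = []
        for j in range(len(req)):
            placed = False
            for b in bins:                      # first-fit into an already opened bin
                r1, r2 = req[j][b.btype]
                bt = bin_types[b.btype]
                if b.used1 + r1 <= bt.cap1 and b.used2 + r2 <= bt.cap2:
                    b.used1 += r1; b.used2 += r2; b.items.append(j)
                    placed = True
                    break
            if not placed:
                # open the cheapest bin type that can hold the object on its own
                # (assumes at least one such type exists)
                feasible = [t for t, bt in enumerate(bin_types)
                            if req[j][t][0] <= bt.cap1 and req[j][t][1] <= bt.cap2]
                t = min(feasible, key=lambda t: bin_types[t].cost)
                r1, r2 = req[j][t]
                bins.append(OpenBin(btype=t, used1=r1, used2=r2, items=[j]))
        total_cost = sum(bin_types[b.btype].cost for b in bins)
        return bins, total_cost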

5.
The three-dimensional bin packing problem consists of packing a set of boxes into the minimum number of bins. In this paper we propose a new GRASP algorithm for solving three-dimensional bin packing problems which can also be directly applied to the two-dimensional case. The constructive phase is based on a maximal-space heuristic developed for the container loading problem. In the improvement phase, several new moves are designed and combined in a VND structure. The resulting hybrid GRASP/VND algorithm is simple and quite fast and the extensive computational results on test instances from the literature show that the quality of the solutions is equal to or better than that obtained by the best existing heuristic procedures.
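
The following is a minimal, generic GRASP skeleton (randomized construction followed by an improvement phase standing in for the paper's VND); the callables construct_randomized, local_search and evaluate are hypothetical placeholders for the problem-specific parts described in the abstract.

    import random

    def grasp(construct_randomized, local_search, evaluate, iterations=100, seed=0):
        """Generic GRASP loop: repeat a greedy-randomized construction and a local
        search, keeping the best solution found (lower evaluate() is better)."""
        rng = random.Random(seed)
        best, best_val = None, float("inf")
        for _ in range(iterations):
            sol = construct_randomized(rng)   # constructive phase
            sol = local_search(sol)           # improvement phase (VND in the paper)
            val = evaluate(sol)               # e.g., number of bins used
            if val < best_val:
                best, best_val = sol, val
        return best, best_val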

6.
The bin packing problem is widely found in applications such as loading of tractor trailer trucks, cargo airplanes and ships, where a balanced load provides better fuel efficiency and a safer ride. In these applications, there are often conflicting criteria to be satisfied, i.e., to minimize the bins used and to balance the load of each bin, subject to a number of practical constraints. Unlike existing studies that only consider the issue of minimum bins, a multiobjective two-dimensional mathematical model for bin packing problems with multiple constraints (MOBPP-2D) is formulated in this paper. To solve MOBPP-2D problems, a multiobjective evolutionary particle swarm optimization algorithm (MOEPSO) is proposed. Without the need to combine both objectives into a composite scalar weighting function, MOEPSO incorporates the concept of Pareto optimality to evolve a family of solutions along the trade-off surface. Extensive numerical investigations are performed on various test instances, and their performances are compared both quantitatively and statistically with other optimization methods to illustrate the effectiveness and efficiency of MOEPSO in solving multiobjective bin packing problems.

7.
Electron Paramagnetic Resonance (EPR) is a spectroscopic technique that detects and characterizes molecules with unpaired electrons (i.e., free radicals). Unlike the closely related nuclear magnetic resonance (NMR) spectroscopy, EPR is still under development as an imaging modality. Although a number of physical factors have hindered its development, EPR's potential is quite promising in a number of important application areas, including in vivo oximetry. EPR images are generally reconstructed using a tomographic imaging technique, of which filtered backprojection (FBP) is the most commonly used. We apply two iterative methods for maximum-entropy image reconstruction in EPR. The first is the multiplicative algebraic reconstruction technique (MART), a well-known row-action method. We propose a second method, known as LSEnt (least-squares entropy), that maximizes entropy and performs regularization by maintaining a desired distance from the measurements. LSEnt is in part motivated by the barrier method of interior-point programming. We present studies in which images of two physical phantoms, reconstructed using FBP, MART, and LSEnt, are compared. The images reconstructed using MART and LSEnt have lower variance, better contrast recovery, subjectively better resolution, and reduced streaking artifact than those reconstructed using FBP. These results suggest that maximum-entropy reconstruction methods (particularly the more flexible LSEnt) may be critical in overcoming some of the physical challenges of EPR imaging.
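
For readers unfamiliar with MART, a minimal sketch of the classical multiplicative row-action update is shown below; it is not the LSEnt method proposed here, and the relaxation handling is a simplifying assumption.

    import numpy as np

    def mart(A, b, n_iter=20, relax=1.0, eps=1e-12):
        """Row-action MART: a multiplicative update that drives A @ x toward b
        while keeping x positive. A: (m, n) nonnegative system matrix,
        b: (m,) positive measurements."""
        m, n = A.shape
        x = np.ones(n)                         # positive initial estimate
        for _ in range(n_iter):
            for i in range(m):                 # one measurement (ray) at a time
                proj = A[i] @ x
                if proj < eps:
                    continue
                ratio = b[i] / proj
                x *= ratio ** (relax * A[i])   # elementwise multiplicative correction
        return x

    # tiny consistent example: the reconstruction reproduces the measurements
    A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
    x_hat = mart(A, A @ np.array([1.0, 2.0, 3.0]))
    print(A @ x_hat)                           # close to the measurements [3., 5.]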

8.
Spectral computed tomography (CT) offers clear advantages in lesion detection, tissue characterization and material decomposition. To further extend its potential clinical applications, in this work we propose an improved tensor dictionary learning method for low-dose spectral CT reconstruction with an image-gradient ℓ0-norm constraint, named ℓ0TDL. The ℓ0TDL method inherits the advantages of tensor dictionary learning (TDL) by exploiting the similarity of spectral CT images. In addition, by introducing the ℓ0-norm constraint in the gradient image domain, the proposed method emphasizes spatial sparsity to overcome the weakness of TDL in preserving edge information. The split-Bregman method is employed to solve the resulting optimization problem. Both numerical simulations and real mouse studies are performed to evaluate the proposed method. The results show that the proposed ℓ0TDL method outperforms competing methods such as total variation (TV) minimization, TV with low rank (TV+LR), and TDL.

9.
The multi-dimensional bin packing problem (BPP) considered here requires packing a set of rectangular-shaped items into a minimum number of identical rectangular-shaped bins. All items may be rotated and the guillotine cut constraint has to be respected. A straightforward heuristic is presented that is based on a method for the container loading problem following a wall-building approach and on a method for the one-dimensional BPP. 1,800 new benchmark instances are introduced for the two-dimensional and three-dimensional BPP. The instances include more than 1,500 items on average. Applied to these very large instances, the heuristic generates solutions of acceptable quality in short computation times. Moreover, the influence of different instance parameters on the solution quality is investigated by an extended computational study.

10.
Following the work of Anily et al., we consider a variant of bin packing called bin packing with general cost structures (GCBP) and design an asymptotic fully polynomial time approximation scheme (AFPTAS) for this problem. In the classic bin packing problem, a set of one-dimensional items is to be assigned to subsets of total size at most 1, that is, to be packed into unit sized bins. However, in GCBP, the cost of a bin is not 1 as in classic bin packing, but it is a non-decreasing and concave function of the number of items packed in it, where the cost of an empty bin is zero. The construction of the AFPTAS requires novel techniques for dealing with small items, which are developed in this work. In addition, we develop a fast approximation algorithm which acts identically for all non-decreasing and concave functions, and has an asymptotic approximation ratio of 1.5 for all functions simultaneously.
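
A tiny example of how a concave, non-decreasing bin cost with f(0) = 0 changes the objective relative to classic bin packing; the square-root cost and the two packings below are arbitrary illustrative choices, not taken from the paper.

    import math

    def gcbp_cost(packing, f):
        """Total cost of a packing under GCBP: each bin costs f(number of items in it),
        with f non-decreasing, concave, and f(0) = 0."""
        return sum(f(len(bin_items)) for bin_items in packing)

    f = lambda m: math.sqrt(m)          # example concave cost; classic BPP uses f(m) = 1 for m >= 1

    # two packings of six small items into two bins each
    balanced = [[0, 1, 2], [3, 4, 5]]   # 3 + 3 items
    skewed   = [[0, 1, 2, 3, 4], [5]]   # 5 + 1 items
    print(gcbp_cost(balanced, f))       # 2*sqrt(3), about 3.46
    print(gcbp_cost(skewed, f))         # sqrt(5) + 1, about 3.24: concavity rewards concentrating items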

11.
The two-dimensional guillotine bin packing problem consists of packing, without overlap, small rectangular items into the smallest number of large rectangular bins where items are obtained via guillotine cuts. This problem is solved using a new guillotine bottom left (GBL) constructive heuristic and its agent-based (A–B) implementation. GBL, which is sequential, successively packs items into a bin and creates a new bin every time it can no longer fit any unpacked item into the current one. A–B, which is pseudo-parallel, uses the simplest system of artificial life. This system consists of active agents dynamically interacting in real time to jointly fill the bins while each agent is driven by its own parameters, decision process, and fitness assessment. A–B is particularly fast and yields near-optimal solutions. Its modularity makes it easily adaptable to knapsack related problems.
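
As a hedged illustration of constructive packing under the guillotine constraint, here is a simplified shelf (level) heuristic; it is not GBL or its agent-based variant, it ignores item rotation, and it assumes every item fits into an empty bin. Shelf packings can always be separated by guillotine cuts, which is why this simple structure respects the constraint.

    def shelf_pack(items, bin_w, bin_h):
        """Simplified shelf heuristic for 2D bin packing: items (w, h) are sorted by
        decreasing height and placed left to right on shelves; a new shelf is opened
        when an item does not fit on any existing shelf of a bin, and a new bin when
        no shelf can be opened."""
        items = sorted(items, key=lambda wh: wh[1], reverse=True)
        bins = []                                 # each bin: list of shelves [used_w, shelf_h, y]
        for w, h in items:
            placed = False
            for shelves in bins:
                for shelf in shelves:             # first-fit on an existing shelf
                    if shelf[0] + w <= bin_w and h <= shelf[1]:
                        shelf[0] += w
                        placed = True
                        break
                if placed:
                    break
                y_next = shelves[-1][2] + shelves[-1][1]
                if y_next + h <= bin_h:           # open a new shelf on top, if it fits
                    shelves.append([w, h, y_next])
                    placed = True
                    break
            if not placed:                        # open a new bin with a first shelf at y = 0
                bins.append([[w, h, 0.0]])
        return len(bins)

    print(shelf_pack([(4, 3), (3, 3), (5, 2), (2, 2)], bin_w=6, bin_h=5))   # 2 bins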

12.
One of the main difficulties of multi-dimensional packing problems is the fragmentation of free space into several unusable small parts after a few items are packed. This study proposes a defragmentation technique to combine the fragmented space into a continuous usable space, which potentially allows the packing of additional items. We illustrate the effectiveness of this technique using the two- and three-dimensional bin packing problem, where the aim is to load all given items (represented by rectangular boxes) into the minimum number of identical bins. Experimental results based on well-known 2D and 3D bin packing data sets show that our defragmentation technique alone is able to produce solutions approaching the quality of considerably more complex meta-heuristic approaches for the problem. In conjunction with a bin shuffling strategy for incremental improvement, our resultant algorithm outperforms all leading meta-heuristic approaches based on the commonly used benchmark data by a significant margin.

13.
The sensitivity of histogram computation to the choice of a reference interval and number of bins can be attenuated by replacing the crisp partition on which the histogram is built by a fuzzy partition. This involves replacing the crisp counting process by a distributed (weighted) voting process. The counterpart to this low sensitivity is some confusion in the count values: a value of 10 in the accumulator associated with a bin can mean 10 observations in the bin or 40 observations near the bin. This confusion can bias the statistical decision process based on such a histogram. In a recent paper, we proposed a method that links the probability measure associated with any subset of the reference interval with the accumulator values of a fuzzy partition-based histogram. The method consists of transferring counts associated with each bin proportionally to its interaction with the considered subset. Two methods have been proposed which are called precise and imprecise pignistic transfer. Imprecise pignistic transfer accounts for the interactivity of two consecutive cells in order to propagate, in the estimated probability measure, counting confusion due to fuzzy granulation. Imprecise pignistic transfer has been conjectured to include precise pignistic transfer. The present article proposes a proof of this conjecture.
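
A minimal sketch of the distributed (weighted) voting step on a triangular fuzzy partition, to make the counting confusion concrete; this covers only the histogram-building stage, not the pignistic transfer studied in the paper.

    import numpy as np

    def fuzzy_histogram(data, bin_centers):
        """Histogram on a fuzzy (triangular) partition: each observation splits its
        unit vote between the two neighbouring bins in proportion to proximity,
        instead of being counted crisply in a single bin."""
        centers = np.asarray(bin_centers, dtype=float)
        acc = np.zeros(len(centers))
        for x in np.asarray(data, dtype=float):
            if x <= centers[0]:
                acc[0] += 1.0
            elif x >= centers[-1]:
                acc[-1] += 1.0
            else:
                i = np.searchsorted(centers, x) - 1        # x lies in [centers[i], centers[i+1]]
                w = (x - centers[i]) / (centers[i + 1] - centers[i])
                acc[i] += 1.0 - w                          # distributed (weighted) vote
                acc[i + 1] += w
        return acc

    print(fuzzy_histogram([0.1, 0.25, 0.9], bin_centers=[0.0, 0.5, 1.0]))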

14.
X-ray imaging is the conventional method for diagnosing the orthopedic condition of a patient. Computerized Tomography (CT) scanning is another diagnostic method that provides the patient's 3D anatomical information. However, both methods have limitations when diagnosing the whole leg: X-ray imaging does not provide 3D information, and normal CT scanning cannot be performed in a standing posture. Obtaining 3D data of the whole leg in a standing posture is clinically important because it enables 3D analysis under the weight-bearing condition. Based on these clinical needs, a hardware-based bi-plane X-ray imaging system has been developed; it uses two orthogonal X-ray images. However, such systems have not become available in general clinics because of their high cost. Therefore, we propose a widely applicable registration method that combines a 2D X-ray image with 3D CT scan data, making it possible to analyze the whole leg three-dimensionally in a standing posture. The method searches for the optimal position whose generated projection is most similar to the captured X-ray image. We verified the performance of the proposed method through simulation-based experiments, and then analyzed the internal-external rotation angle of the femur using real patient data. Approximately 10.55 degrees of internal rotation was found relative to the defined anterior-posterior direction. In this paper, we present a useful registration method using the conventional X-ray image and 3D CT scan data to analyze the whole leg under the weight-bearing condition.

15.
Histogram theory and optimal histogram construction
The histogram is one of the most common tools for density estimation and data analysis. In histogram theory and construction, the choice of bin width and the placement of bin boundaries are particularly important. However, many researchers still choose these two parameters empirically, and even today most statistical software packages determine the number of histogram bins by default using rough rule-of-thumb formulas. This paper surveys recent results on histogram theory and optimal histogram construction, with emphasis on sample-driven methods for building optimal histograms.
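
For context, two of the rough rule-of-thumb formulas the abstract alludes to (Sturges and Freedman-Diaconis), sketched with NumPy; these are precisely the default-style rules the paper argues should be replaced by sample-driven optimal histogram construction.

    import numpy as np

    def n_bins(data, rule="fd"):
        """Two common rule-of-thumb choices for the number of histogram bins:
        'sturges': 1 + ceil(log2(n));  'fd': Freedman-Diaconis width 2*IQR*n**(-1/3)."""
        x = np.asarray(data, dtype=float)
        n = x.size
        if rule == "sturges":
            return int(np.ceil(np.log2(n))) + 1
        q75, q25 = np.percentile(x, [75, 25])
        width = 2.0 * (q75 - q25) * n ** (-1.0 / 3.0)    # Freedman-Diaconis bin width
        if width == 0:
            return 1
        return int(np.ceil((x.max() - x.min()) / width))

    rng = np.random.default_rng(0)
    sample = rng.normal(size=1000)
    print(n_bins(sample, "sturges"), n_bins(sample, "fd"))
    # NumPy exposes the same rules via np.histogram_bin_edges(sample, bins="fd")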

16.
We consider the two-dimensional bin packing problem: given a set of rectangular items, find the minimal number of rectangular bins needed to pack all items. Rotation of the items is not permitted. We show for any integer k ≥ 3 that at most k-1 bins are needed to pack all items if every item fits into a bin and if the total area of the items does not exceed k/4 times the bin area. Moreover, this bound is tight. Furthermore, we show that only two bins are needed to pack all items if the total area of the items is not larger than the bin area, and if the height of each item is not larger than a third of the bin height and the width of every item does not exceed half of the bin width.
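
A small worked check of the stated area bound, assuming no rotation as in the paper: find the smallest integer k ≥ 3 such that the total item area is at most (k/4) times the bin area, and report k-1 as the implied number of bins.

    import math

    def area_bound(items, bin_w, bin_h):
        """Upper bound on the number of bins implied by the stated result: if every
        item fits in a bin and the total item area is at most (k/4) times the bin
        area for some integer k >= 3, then k - 1 bins suffice."""
        assert all(w <= bin_w and h <= bin_h for w, h in items), "every item must fit in a bin"
        total = sum(w * h for w, h in items)
        k = max(3, math.ceil(4.0 * total / (bin_w * bin_h)))
        return k - 1

    # three items covering 3/4 of one bin area: k = 3, so at most 2 bins by the theorem
    print(area_bound([(0.5, 0.5), (0.5, 0.5), (0.5, 0.5)], bin_w=1.0, bin_h=1.0))   # 2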

17.
The bin packing problem with conflicts (BPC) consists of minimizing the number of bins used to pack a set of items, where some items cannot be packed together in the same bin due to compatibility restrictions. The concepts of dual-feasible functions (DFF) and data-dependent dual-feasible functions (DDFF) have been used in the literature to improve the resolution of several cutting and packing problems. In this paper, we propose a general framework for deriving new DDFF as well as a new concept of generalized data-dependent dual-feasible functions (GDDFF), a conflict generalization of DDFF. The GDDFF take into account the structure of the conflict graph using the techniques of graph triangulation and tree-decomposition. Then we show how these techniques can be used in order to improve the existing lower bounds.

18.
Consider a balls-in-bins process in which each new ball goes into a given bin with probability proportional to f(n), where n is the number of balls currently in the bin and f is a fixed positive function. It is known that these so-called balls-in-bins processes with feedback have a monopolistic regime: if f(x) = x^p for p > 1, then there is a finite time after which one of the bins will receive all incoming balls. Our goal in this article is to quantify the onset of monopoly. We show that if the initial number of balls is large and bin 1 starts with a fraction α > 1/2 of the balls, then with very high probability its share of the total number of balls never decreases significantly below α. Thus a bin that obtains more than half of the balls at a "large time" will most likely preserve its position of leadership. However, the probability that the winning bin has a non-negligible advantage after n balls are in the system is ~ const. × n^(1-p), and the number of balls in the losing bin has a power-law tail. Similar results also hold for more general functions f. © 2009 Wiley Periodicals, Inc. Random Struct. Alg., 2009
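
An illustrative simulation of the feedback process with f(n) = n^p; the exponent p = 1.5, the initial split and the restriction to two bins are arbitrary choices for demonstration, not parameters from the paper.

    import random

    def balls_in_bins_feedback(n_balls, init=(60, 40), p=1.5, seed=0):
        """Simulate a two-bin balls-in-bins process with feedback f(n) = n**p:
        each new ball joins bin i with probability proportional to counts[i]**p.
        For p > 1 the leading bin tends to keep and extend its lead."""
        rng = random.Random(seed)
        counts = list(init)
        for _ in range(n_balls):
            w0, w1 = counts[0] ** p, counts[1] ** p
            i = 0 if rng.random() < w0 / (w0 + w1) else 1
            counts[i] += 1
        return counts

    print(balls_in_bins_feedback(10000))   # bin 0 typically ends up with the vast majority of balls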

19.
The 2-constraint bin packing problem consists of packing a given number of items, each one characterised by two different and unrelated dimensions, into the minimum number of bins without exceeding the capacity of the bins in either dimension. The development of heuristics for this problem is challenged by the need for a proper criterion for evaluating the feasibility of the two capacity constraints in the two different dimensions. In this paper, we propose a computational evaluation of several such criteria, and two simple but effective algorithms, a greedy algorithm and a neighbourhood search algorithm, for solving the 2-constraint bin packing problem. An extensive computational analysis supports our main claim.
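
As a baseline for the kind of feasibility check discussed here, a minimal first-fit sketch for two capacity constraints; the instance at the end is made up, and this is not one of the paper's proposed algorithms.

    def first_fit_2c(items, cap1, cap2):
        """First-fit greedy for the 2-constraint bin packing problem: each item has
        two unrelated demands (d1, d2); a bin is feasible only if both capacities
        are respected. Items are tried against open bins in order; a new bin is
        opened when none fits."""
        bins = []                                  # each bin: [used1, used2, item_indices]
        for j, (d1, d2) in enumerate(items):
            for b in bins:
                if b[0] + d1 <= cap1 and b[1] + d2 <= cap2:
                    b[0] += d1; b[1] += d2; b[2].append(j)
                    break
            else:
                bins.append([d1, d2, [j]])
        return bins

    print(len(first_fit_2c([(0.6, 0.2), (0.5, 0.5), (0.3, 0.7), (0.4, 0.4)], 1.0, 1.0)))   # 2 bins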

20.
The quality of cereal grains in storage will deteriorate to an unacceptable level if they are not kept dry and cool. To model the drying and cooling process, an accurate knowledge of the airflow distribution is required. In this paper, the equations used to model the air velocity are analysed. To study the flow of air through a typical drying system for stored grain, a two-dimensional rectangular bin is considered, with a single source of air on the bin floor. Two paths of study are undertaken: the first is a linear analysis for low velocities, and following on from this is a nonlinear approach for larger velocities. The linear analysis is used to study a bin with a semi-infinite height, and the drying pattern is studied in this bin using the air traverse time. Then bins with a finite height are analysed: it is shown that, for tall enough bins, the semi-infinite solution is accurate enough. A perturbation analysis is used to study the semi-infinite bin when the air velocity is too large for the linear analysis to be accurate. It is shown that the effect of the nonlinearity is to move the air away from the high-velocity regions towards the areas of lower velocity.
