Similar Documents
20 similar documents found (search time: 46 ms)
1.
Reconstruction of 3D curves from their stereo images is an important problem in computer vision. Based on deformation of the snake model and a NURBS representation, we evolve the curve from an inverse-optimization viewpoint to carry out the reconstruction. This approach reduces the need to match space-curve projections across multiple views while improving reconstruction precision. Since the reconstructed 2D data inevitably contain errors, we then discuss, for a two-camera setup, how these errors affect the stereo reconstruction. Finally, the proposed approach is tested on both synthetic and real data and achieves satisfactory reconstruction results.
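As a rough illustration of the data term involved in this kind of approach (not the authors' exact formulation), the sketch below evaluates a cubic B-spline space curve from its control points, projects it into two views and measures the reprojection error; the camera matrices P1, P2 and the observed image samples obs1, obs2 are hypothetical placeholders.

```python
import numpy as np
from scipy.interpolate import BSpline

def reprojection_energy(ctrl_pts, knots, P1, P2, obs1, obs2, n_samples=100):
    """Sum of squared reprojection errors of a cubic spline space curve in two views.

    ctrl_pts   : (n, 3) control points (the variables a snake-style evolution would update).
    knots      : clamped knot vector of length n + 4.
    P1, P2     : (3, 4) camera projection matrices.
    obs1, obs2 : (n_samples, 2) observed image points along the curve in each view.
    """
    curve = BSpline(knots, ctrl_pts, 3)               # cubic B-spline (a NURBS with unit weights)
    u = np.linspace(knots[3], knots[-4], n_samples)   # parameters inside the valid span
    X = curve(u)                                      # (n_samples, 3) points on the space curve
    Xh = np.hstack([X, np.ones((n_samples, 1))])      # homogeneous coordinates

    def project(P):
        x = Xh @ P.T
        return x[:, :2] / x[:, 2:3]                   # perspective division

    return np.sum((project(P1) - obs1) ** 2) + np.sum((project(P2) - obs2) ** 2)
```

Minimizing such an energy over the control points, together with smoothness terms, is one way to evolve the curve without explicitly matching the two projections point by point.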

3.
This paper describes an interactive 3D animation approach that may help with the investigation, understanding and interpretation of results for a Visual Interactive Simulation (VIS) project. The method uses a graphics algorithm to draw '3D contour maps' on a computer screen, showing the response of a simulation model to changes in its input parameters. A previous paper showed that a neural network can learn the response of a simulation; this paper shows that the speed of a trained neural network can be exploited to produce 3D animation sequences of the simulation's results. The '3D contour map' can be rotated, zoomed, panned, or generally viewed from different perspectives. Two example problems are described. The paper suggests that this approach can further improve the quality of VIS by providing comprehensive graphical 3D sensitivity analyses of the original problem under investigation.
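A minimal sketch of the underlying idea (hypothetical data and model, not the paper's network): fit a fast surrogate to a set of simulation runs, then evaluate it over a dense parameter grid to obtain the data from which a '3D contour map' frame is drawn.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical simulation: a scalar response of two input parameters.
def run_simulation(p1, p2):
    return np.sin(p1) * np.cos(p2) + 0.1 * p1 * p2

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 3.0, size=(200, 2))           # sampled input parameters
y_train = run_simulation(X_train[:, 0], X_train[:, 1])    # (slow) simulation responses

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X_train, y_train)                            # learn the response surface

# Dense grid for one frame; the surrogate is fast enough to regenerate frames
# interactively while the 3D contour map is rotated, zoomed or panned.
g1, g2 = np.meshgrid(np.linspace(0, 3, 50), np.linspace(0, 3, 50))
frame = surrogate.predict(np.column_stack([g1.ravel(), g2.ravel()])).reshape(g1.shape)
```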

4.
Full waveform inversion (FWI) of seismic traces recorded at the free surface allows the reconstruction of the structure of the physical parameters of the underlying medium. For such a reconstruction, an optimization problem is defined in which synthetic traces, obtained through numerical techniques such as finite-difference or finite-element methods in a given model of the subsurface, should match the observed traces. The number of data samples is routinely around 1 billion for 2D problems and 1 trillion for 3D problems, while the number of parameters ranges from 1 million to 10 million degrees of freedom. Moreover, if one defines the mismatch as the standard least-squares norm between values sampled in time/frequency and space, the misfit function has a significant number of secondary minima related to the ill-posedness and nonlinearity of the inversion problem, linked to the so-called cycle skipping. Given the size of the problem, we consider a local linearized method in which the gradient is computed using the adjoint formulation of the seismic wave propagation problem. Starting from an initial model, we apply a quasi-Newton method, which allows us to formulate the reconstruction of various parameters such as P- and S-wave velocities, density, or attenuation factors. A hierarchical strategy is adopted, based on incrementally increasing the data complexity: from low-frequency to high-frequency content, from initial wavelets to later phases in the data space, from narrow azimuths to wide azimuths, and from simple observables to more complex ones. Different synthetic examples on realistic structures illustrate the efficiency of this strategy based on data manipulation. This strategy in the data space has to be inserted into a more global framework in which we can significantly improve the probability of converging to the global minimum. When considering the model space, we may rely on the construction of the initial model or add constraints such as smoothness of the sought model and/or prior information collected by other means. An alternative strategy concerns the construction of the objective function itself, where various choices may increase the linearity of the inversion procedure.
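In the standard least-squares setting alluded to above (a generic formulation, not tied to any particular parameterization of the medium), the misfit and its adjoint-state gradient can be written as

\[
\chi(\mathbf{m}) = \tfrac{1}{2}\sum_{s,r}\int_0^T \bigl|\, d_{\mathrm{syn}}(\mathbf{x}_r,t;\mathbf{m},s) - d_{\mathrm{obs}}(\mathbf{x}_r,t;s)\,\bigr|^2\,\mathrm{d}t,
\qquad
\nabla_{\mathbf{m}}\chi = \sum_{s}\int_0^T \lambda_s(\mathbf{x},t)\,\frac{\partial \mathcal{A}(\mathbf{m})}{\partial \mathbf{m}}\,u_s(\mathbf{x},t)\,\mathrm{d}t,
\]

where u_s is the forward wavefield for source s, \lambda_s is the adjoint wavefield obtained by back-propagating the data residuals from the receivers, and \mathcal{A}(\mathbf{m}) is the wave-equation operator; a quasi-Newton (e.g. l-BFGS) update of \mathbf{m} is then built from this gradient.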

5.
For the first time, the inverse Sturm–Liouville problem with nonseparated boundary conditions is studied on a star-shaped geometric graph with three edges. It is shown that the Sturm–Liouville problem with general boundary conditions cannot be uniquely reconstructed from four spectra. Nonseparated boundary conditions are found for which a uniqueness theorem for the solution of the inverse Sturm–Liouville problem is proved. The spectrum of the boundary value problem itself and the spectra of three auxiliary problems are used as reconstruction data. It is also shown that the Sturm–Liouville problem with these nonseparated boundary conditions can be uniquely recovered if three spectra of auxiliary problems are used as reconstruction data and only five of its eigenvalues are used instead of the entire spectrum of the problem.

6.
Jürgen Frikel, PAMM, 2011, 11(1): 847-848
We investigate the reconstruction problem for limited angle tomography. Such problems arise naturally in applications like digital breast tomosynthesis, dental tomography, etc. Since the acquired tomographic data is highly incomplete, the reconstruction problem is severely ill-posed and traditional reconstruction methods, such as filtered backprojection (FBP), do not perform well in such situations. To stabilize the inversion we propose the use of a sparse regularization technique in combination with curvelets. We argue that this technique has the ability to preserve edges. As our main result, we present a characterization of the kernel of the limited angle Radon transform in terms of curvelets. Moreover, we characterize the reconstructions obtained via curvelet sparse regularization at a limited angular range. As a result, we show that the dimension of the limited angle problem can be significantly reduced in the curvelet domain.
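The curvelet sparse regularization referred to above typically takes the form of a weighted l1-penalized least-squares problem (notation here is generic rather than the author's):

\[
c^{\star} \in \arg\min_{c}\ \tfrac{1}{2}\,\bigl\|\mathcal{R}_{\Phi}\,T^{*}c - y\bigr\|_2^2 + \sum_{j} w_j\,\lvert c_j\rvert, \qquad f^{\star} = T^{*}c^{\star},
\]

where \mathcal{R}_{\Phi} is the Radon transform restricted to the available angular range, T^{*} is the curvelet synthesis operator, y is the measured data and the weights w_j set the strength of the sparsity constraint. The kernel characterization mentioned above identifies the curvelet coefficients that are invisible to such data, which is what allows the dimension of the problem to be reduced in the curvelet domain.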

7.
Surface reconstruction from scattered data is an important problem in such areas as reverse engineering and computer aided design. In solving partial differential equations derived from surface reconstruction problems, the level-set method has been successfully used. We present in this paper a theoretical analysis on the existence and uniqueness of the solution of a partial differential equation derived from a model of surface reconstruction using the level-set approach. We give the uniqueness analysis of the cl...

8.
One of the most significant achievements of theoretical computer science was to show that there are noncomputable problems, which cannot be solved by any algorithm. Although the formulation of such problems is mathematical, they can often be interpreted as problems arising in other fields, such as physics or computer science. However, no noncomputable problem of economic or financial origin had been presented before. Here, we study the problem of valuation: given some adequate data, find the value of an asset. Valuation is modeled mathematically by the discounted cash flow operator. We show, using surprisingly simple arguments, that this operator is not computable. As financial markets should, in theory, trade assets at their fair value, our result suggests that the unpredictability of such markets may partially stem from inherently noncomputable behavior. A discussion of this result is also included.
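For context, the noncomputability result concerns the general discounted cash flow operator (e.g. acting on infinite or exactly specified real-valued cash-flow streams); on a finite stream the computation itself is elementary, as this small sketch shows.

```python
def discounted_cash_flow(cashflows, rate):
    """Present value of a finite cash-flow stream at a constant per-period discount rate.

    cashflows : iterable of payments, the first received one period from now.
    rate      : per-period discount rate, e.g. 0.05 for 5%.
    """
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows, start=1))

# Example: three payments of 100 discounted at 5% per period.
value = discounted_cash_flow([100.0, 100.0, 100.0], 0.05)   # approximately 272.32
```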

9.
In this article, we study three interconnected inverse problems in shift-invariant spaces: 1) the convolution/deconvolution problem; 2) the uniformly sampled convolution and the associated reconstruction problem; 3) sampled convolution followed by sampling on an irregular grid and the associated reconstruction problem. In all three cases, we study both stable and ill-posed reconstruction problems. We characterize the convolutors that allow stable deconvolution as well as those giving rise to ill-posed deconvolution. We also characterize the convolutors that allow stable reconstruction as well as those giving rise to ill-posed reconstruction from uniform sampling. The connection between stable deconvolution and stable reconstruction from samples after convolution is subtle, as demonstrated by several examples and theorems relating the two problems.

10.
The objective of this paper is to explore the implementation of realistic 3D image reconstruction using geometric algebra (GA). We illustrate the suitability of GA for representing structures and developing algorithms in computer graphics, especially for engineering applications such as 3D image modeling. A first consequence is an efficient framework model suitable for implementation on programmable hardware. The results show that with GA the computations are less complex, and geometric operations reduce to simple computations. As a next step, the resulting model can be implemented in hardware for 3D image reconstruction. We also discuss the potential of GA for optimization and highly efficient implementations.

11.
The Quadratic Assignment Problem (QAP) is known as one of the most difficult problems in combinatorial optimization. It is used to model many practical problems, including various layout problems. The main topic of this paper is to provide methods for checking whether a particular instance of the QAP is a layout problem. An instance is a layout problem if the distances of the objects can be reconstructed in the plane and/or in 3-dimensional space. A new mixed integer programming model is suggested for the case where the distances are assumed to be rectilinear. If the distances are Euclidean, the well-known Multi-Dimensional Scaling (MDS) method of statistics is suggested for reconstruction purposes. The well-known difficulty of the QAP makes it a popular and suitable experimental field for many algorithmic ideas, including artificial intelligence methods, and such results are sometimes published as layout problems. The methods of reconstruction can be used to decide whether the topic of a paper is a layout problem or only a general QAP. The question of what the OR community should expect from AI-based algorithms is also addressed.
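A minimal sketch of the classical multidimensional scaling test implied above (generic code, not the paper's procedure): a distance matrix embeds, up to tolerance, in the plane or in 3-space exactly when the doubly centred squared-distance matrix is positive semidefinite with at most two or three significantly positive eigenvalues.

```python
import numpy as np

def embedding_dimension(D, tol=1e-8):
    """Smallest dimension in which the distance matrix D embeds as Euclidean distances
    (classical MDS); returns None if D is not Euclidean at all."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n            # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                    # Gram matrix of an embedding, if one exists
    eigvals = np.linalg.eigvalsh(B)
    scale = max(eigvals.max(), 1.0)
    if eigvals.min() < -tol * scale:
        return None                                # significantly negative eigenvalue: not Euclidean
    return int(np.sum(eigvals > tol * scale))

# A QAP instance with Euclidean distances is a layout problem if this returns 2 or 3.
```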

12.
It has been known for many years that an optimal discrete nonlinear filter may be synthesized for systems whose plant dynamics, sensor characteristics and signal statistics are known, by applying Bayes' rule to sequentially update the conditional probability density function from the latest data. However, it was not until 1969 that a digital computer algorithm implementing the theory for a one-state-variable one-step predictor appeared in the literature. This delay, and the continuing scarcity of multidimensional nonlinear filters, result from the overwhelming computational task, which leads to unrealistic data processing times. For many nonlinear filtering problems, analog and digital computers (a hybrid computation) can be combined to yield a higher data rate than can be obtained by conventional digital methods. This paper describes an implementation of the theory by means of a hybrid computer algorithm for the optimal nonlinear one-step predictor.
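The sequential update referred to here is the standard Bayesian prediction/update recursion, written generically as

\[
p(x_{k+1}\mid z_{1:k}) = \int p(x_{k+1}\mid x_k)\,p(x_k\mid z_{1:k})\,\mathrm{d}x_k,
\qquad
p(x_{k+1}\mid z_{1:k+1}) = \frac{p(z_{k+1}\mid x_{k+1})\,p(x_{k+1}\mid z_{1:k})}{\int p(z_{k+1}\mid x)\,p(x\mid z_{1:k})\,\mathrm{d}x},
\]

where the first (Chapman–Kolmogorov) step yields the one-step prediction and the second is the Bayes-rule measurement update; evaluating these integrals over a grid of discrete points in every state dimension is what makes the all-digital computation so expensive.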

The hybrid computer algorithm presented reduces the overall solution time per prediction because:

1) Many large computations of identical form are executed on the analog computer in parallel.

2) The discrete running variable in the digital algorithm may be replaced with a continuous analog computer variable in one or more dimensions leading to increased computational speed and finer resolution of the exponential transformation.

3) The modern analog computer is well suited to generate functions such as the exponential at high speed with modest equipment.

4) The arithmetic, storage, and control functions performed rapidly by the digital computer are utilized without introducing extensive auxiliary calculations.

To illustrate pertinent aspects of the algorithm developed, the scalar cubed sensor problem previously described by Bucy is treated extensively. The hybrid algorithm is described. Problems associated with partitioning of equations between analog and digital computers, machine representations of variables, setting of initial conditions and floating of grid base are discussed. The effects of analog component bandwidths, digital-to-analog and analog-to-digital conversion times, analog computer mode switching times and digital computer I/O data rates on overall processing time are examined. The effect of limited analog computer dynamic range on accuracy is discussed. Results from a simulation of this optimal predictor using MOBSSL, a continuous system simulation language, are given. Timing estimates are presented and compared against similar estimates for the all digital algorithm.

For example, given a four-state variable optimal 1-step predictor utilizing 7 discrete points in each dimension, the hybrid algorithm can be used to generate predictions accurate to 2 decimal places once every 10 seconds. An analog computer complement of 250 integrators and multipliers and a high-speed 3rd generation digital computer such as the CDC 6600 or IBM 360/85 are required. This compares with a lower bound of about 3 seconds per all-digital prediction, which would require 49 CDC 6600's operating in parallel. Analytical and simulation work quantifying errors in one-state-variable filters is presented. Finally, the use of an interactive graphic system for real time display and for filter evaluation is described.

13.
Packing and covering problems for metric spaces, and graphs in particular, are of essential interest in combinatorics and coding theory. They are formulated in terms of metric balls of vertices. We consider a new problem in graph theory which is also based on the consideration of metric balls of vertices, but which is distinct from the traditional packing and covering problems. This problem is motivated by applications in information transmission when redundancy of messages is not sufficient for their exact reconstruction, and applications in computational biology when one wishes to restore an evolutionary process. It can be defined as the reconstruction, or identification, of an unknown vertex in a given graph from a minimal number of vertices (erroneous or distorted patterns) in a metric ball of a given radius r around the unknown vertex. For this problem it is required to find minimum restrictions for such a reconstruction to be possible and also to find efficient reconstruction algorithms under such minimal restrictions. In this paper we define error graphs and investigate their basic properties. A particular class of error graphs occurs when the vertices of the graph are the elements of a group, and when the path metric is determined by a suitable set of group elements. These are the undirected Cayley graphs. Of particular interest is the transposition Cayley graph on the symmetric group which occurs in connection with the analysis of transpositional mutations in molecular biology [P.A. Pevzner, Computational Molecular Biology: An Algorithmic Approach, MIT Press, Cambridge, MA, 2000; D. Sankoff, N. El-Mabrouk, Genome rearrangement, in: T. Jiang, T. Smith, Y. Xu, M.Q. Zhang (Eds.), Current Topics in Computational Molecular Biology, MIT Press, 2002]. We obtain a complete solution of the above problems for the transposition Cayley graph on the symmetric group.

14.
An approach for reconstructing tomographic images based on continuous dynamical methods is presented. The method consists of a continuous-time image reconstruction (CIR) system described by differential equations for solving linear inverse problems. We theoretically demonstrate that the trajectories converge to a least-squares solution of the linear inverse problem. An implementation as an equivalent electronic circuit is significantly faster than conventional discrete-time image reconstruction (DIR) systems executed on a digital computer. The merits of the CIR are demonstrated on a tomographic inverse problem in which simulated noisy projection data are generated from a known phantom. We numerically demonstrate that the CIR system does not produce unphysical negative pixel values if one starts from positive initial values, and that it recovers the phantom with almost the same quality as the DIR.
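One common continuous-time scheme of this kind (a generic sketch, not necessarily the circuit dynamics used by the authors) is the gradient flow dx/dt = A^T(b - Ax), whose trajectories converge to a least-squares solution of Ax = b; here it is simulated with explicit Euler steps on hypothetical projection data.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 20))                  # hypothetical projection (system) matrix
x_true = rng.random(20)
b = A @ x_true + 0.01 * rng.standard_normal(60)    # noisy "projection data"

x = np.full(20, 0.5)                               # positive initial values
dt = 0.5 / np.linalg.norm(A, 2) ** 2               # stable step size for explicit Euler
for _ in range(20000):
    x = x + dt * (A.T @ (b - A @ x))               # continuous-time reconstruction dynamics

x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x - x_ls))                    # small: the trajectory approaches the LS solution
```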

15.
We recently proposed a coupled complex boundary method (CCBM) for inverse source problems [X.L. Cheng et al., A novel coupled complex boundary method for inverse source problems, Inverse Problems 30 (2014) 055002]. In this paper, we apply the CCBM to inverse conductivity problems (ICPs) with one measurement. In the ICP, the diffusion coefficient q is to be determined from both Dirichlet and Neumann boundary data. With the CCBM, q is sought such that the imaginary part of the solution of a forward Robin boundary value problem vanishes in the problem domain. This brings advantages in robustness and computation for the reconstruction. Based on the complex forward problem, Tikhonov regularization is used for a stable reconstruction. Some theoretical analysis of the optimization models is given. Several numerical examples are provided to show the feasibility and usefulness of the CCBM for the ICP. It is illustrated that, as long as all the subdomains share some portion of the boundary, our CCBM-based Tikhonov regularization method can reconstruct the diffusion parameters stably and effectively.

16.
Surface reconstruction is very important for surface characterization and graphics processing. Radial basis functions (RBFs) have become a popular method for reconstructing 3D surfaces from scattered data. However, the reconstruction is relatively inaccurate near the boundary of the domain. To address this problem, a circle of new centres is added outside the domain of interest. The factors that influence the boundary behaviour are analyzed quantitatively via numerical experiments. It is demonstrated that if the new centres are properly located, the boundary problem can be effectively overcome without reducing the accuracy in the interior. A modified Graham scan technique is introduced to obtain the boundary points of a scattered point set. These boundary points are extended outward by an appropriate distance and then uniformized to form the new auxiliary centres.
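A rough sketch of the idea, simplified to a height field z = f(x, y) with Gaussian RBFs and a circular ring of auxiliary centres (the paper's boundary extraction and centre placement rules are not reproduced here):

```python
import numpy as np

def fit_rbf(points, values, aux_centres, eps=2.0):
    """Least-squares Gaussian-RBF fit using the data points plus auxiliary centres
    placed outside the domain to improve behaviour near the boundary."""
    centres = np.vstack([points, aux_centres])
    d = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
    Phi = np.exp(-(eps * d) ** 2)                          # collocation matrix
    weights, *_ = np.linalg.lstsq(Phi, values, rcond=None)
    return centres, weights

def eval_rbf(x, centres, weights, eps=2.0):
    d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2)
    return np.exp(-(eps * d) ** 2) @ weights

# Scattered samples in the unit disk plus a ring of auxiliary centres just outside it.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, (300, 2))
pts = pts[np.linalg.norm(pts, axis=1) < 1.0]
vals = np.sin(2 * pts[:, 0]) + pts[:, 1] ** 2
theta = np.linspace(0, 2 * np.pi, 24, endpoint=False)
ring = 1.3 * np.column_stack([np.cos(theta), np.sin(theta)])
centres, w = fit_rbf(pts, vals, ring)
```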

17.
Surface reconstruction from large unorganized data sets is very challenging, especially if the data present undesired holes. This is usually the case when the data come from laser scanner 3D acquisitions or if they represent damaged objects to be restored. An attractive field of research focuses on situations in which these holes are too geometrically and topologically complex to fill using triangulation algorithms. In this work a local approach to surface reconstruction from point-clouds based on positive definite Radial Basis Functions (RBF) is presented that progressively fills the holes by expanding the neighbouring information. The method is based on the algorithm introduced in [7] which has been successfully tested for the smooth multivariate interpolation of large scattered data sets. The local nature of the algorithm allows for real time handling of large amounts of data, since the computation is limited to suitable small areas, thus avoiding the critical efficiency problem involved in RBF multivariate interpolation. Several tests on simulated and real data sets demonstrate the efficiency and the quality of the reconstructions obtained using the proposed algorithm. AMS subject classification 65D17, 65D05, 65Y20

18.
A certain regularization technique for contact problems leads to a family of problems that can be solved efficiently using infinite-dimensional semismooth Newton methods, or, equivalently in this case, primal–dual active set strategies. We present two procedures that use a sequence of regularized problems to obtain the solution of the original contact problem: a first-order augmented Lagrangian method and a path-following method. The first strategy is based on a multiplier update, while path-following with respect to the regularization parameter uses theoretical results about the path-value function to increase the regularization parameter appropriately. Comprehensive numerical tests investigate the performance of the proposed strategies for both a 2D and a 3D contact problem.
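For a flavour of the primal–dual active set iteration mentioned above, here is a minimal sketch for a discretized obstacle problem (minimize 0.5 x^T A x - b^T x subject to x <= psi); the paper's regularized contact formulation, augmented Lagrangian update and path-following are not reproduced.

```python
import numpy as np

def pdas_obstacle(A, b, psi, c=1.0, max_iter=50):
    """Primal-dual active set method for: minimize 0.5 x^T A x - b^T x  s.t.  x <= psi,
    with A symmetric positive definite.  KKT: A x + lam = b, lam >= 0, lam * (x - psi) = 0."""
    n = len(b)
    x = np.minimum(np.linalg.solve(A, b), psi)     # unconstrained solution clipped to the obstacle
    lam = np.zeros(n)
    for _ in range(max_iter):
        active = lam + c * (x - psi) > 0           # predicted contact (active) set
        inactive = ~active
        x_new = np.empty(n)
        lam_new = np.zeros(n)
        x_new[active] = psi[active]                # enforce the constraint on the active set
        if inactive.any():                         # solve the reduced system on the inactive set
            rhs = b[inactive] - A[np.ix_(inactive, active)] @ x_new[active]
            x_new[inactive] = np.linalg.solve(A[np.ix_(inactive, inactive)], rhs)
        lam_new[active] = (b - A @ x_new)[active]  # multiplier from the stationarity condition
        x, lam = x_new, lam_new
        if np.array_equal(lam + c * (x - psi) > 0, active):
            break                                  # active set has settled: KKT conditions hold
    return x, lam
```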

19.
The problem of deriving the structure of a non-deterministic system from its behaviour is a difficult one even when that behaviour is itself well-defined. When the behaviour can be described only in fuzzy terms, structural inference may appear virtually impossible. However, a rigorous formulation and solution of the problem for stochastic automata has recently been given [1] and, in this paper, the results are extended to fuzzy stochastic automata and grammars. The results obtained are of interest on a number of counts. (1) They are a further step towards an integrated ‘theory of uncertainty’; (2) They give new insights into problems of inductive reasoning and processes of ‘precisiation’; (3) They are algorithmic and have been embodied in a computer program that can be applied to the modelling of sequential fuzzy data; (4) They demonstrate that sequential fuzzy data may be modelled naturally in terms of ‘possibility’ vectors.

20.
Natural systems are typically nonlinear and complex, and it is of great interest to be able to reconstruct a system in order to understand its mechanism: such a reconstruction can not only recover nonlinear behaviors but also predict future dynamics. Thanks to advances in modern technology, big data have become increasingly accessible, and consequently the problem of reconstructing systems from measured data or time series plays a central role in many scientific disciplines. In recent decades, nonlinear methods rooted in state space reconstruction have been developed; they do not assume any model equations but recover the dynamics purely from the measured time series data. In this review, the development of state space reconstruction techniques is introduced and recent advances in system prediction and causality inference using state space reconstruction are presented. Particular attention is given to cutting-edge methods for dealing with short time series data. Finally, the advantages as well as the remaining problems in this field are discussed.
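As a minimal, generic illustration of the first step in such methods (not any specific algorithm from the review), a delay-coordinate (Takens) embedding reconstructs a state space directly from a scalar time series:

```python
import numpy as np

def delay_embedding(series, dim, tau):
    """Build delay vectors x_k = (s_k, s_{k+tau}, ..., s_{k+(dim-1)*tau}) from a scalar
    time series, the basic building block of state space reconstruction."""
    series = np.asarray(series)
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

# Example: embed a noisy sine wave in a 3-dimensional reconstructed state space.
t = np.linspace(0, 40 * np.pi, 4000)
s = np.sin(t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
X = delay_embedding(s, dim=3, tau=25)      # X[k] approximates the system state at step k
```

Prediction and causality-inference methods based on state space reconstruction then operate on these reconstructed state vectors.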
