Similar documents
 Found 20 similar documents (search time: 31 ms)
1.
The Fermilab CKM (E921) experiment studies a rare kaon decay that has a very small branching ratio and can be very hard to separate from background processes. A trigger and DAQ system is required to collect all the information necessary for background rejection and to maintain high reliability at a high beam rate. These unique challenges have emphasized the following guiding concepts: (1) Collecting background is as important as collecting good events. (2) A DAQ "event" should not be just a "snapshot" of the detector; it should be a short history record of the detector around the candidate event. The hit history provides information to understand temporary detector blindness, which is extremely important to the CKM experiment. (3) The main purpose of the trigger system should not be "knocking down the trigger rate" or "throwing out garbage events"; instead, it should classify the events and select appropriate data-collecting strategies among various predefined ones for the given event types. The following methodologies are employed in the architecture to fulfill the experiment requirements without confronting unnecessary technical difficulties: (1) Continuous digitization near the detector elements is utilized to preserve data quality. (2) The concept of minimum synchronization is adopted to eliminate the need for time-matching signal paths. (3) A global level-1 trigger performs coincidence and veto functions using digital timing information to avoid problems due to signal degradation in long cables. (4) The DAQ logic allows chronicle records to be collected around interesting events with different levels of detail of ADC information, so that very low-energy particles in the veto systems can be best detected. (5) A re-programmable hardware trigger (L2.5) and a software trigger (L3) sitting in the DAQ stream are planned to perform data-selection functions on the full detector data with adjustability.
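The "short history record" idea above can be sketched as a rolling hit buffer from which a window around a trigger candidate is read out. This is a minimal illustration only; the class name, buffer depth and window sizes are invented, not the CKM design.

```python
from collections import deque

class HitHistoryBuffer:
    """Illustrative circular buffer keeping a rolling record of detector
    hits, so a time window around a trigger candidate can be read out
    (a 'chronicle' rather than a single snapshot)."""

    def __init__(self, depth=10000):
        self.hits = deque(maxlen=depth)  # (timestamp, channel, adc) tuples

    def record(self, timestamp, channel, adc):
        self.hits.append((timestamp, channel, adc))

    def chronicle(self, t_candidate, before=50, after=50):
        """Return all buffered hits within [t-before, t+after]."""
        return [h for h in self.hits
                if t_candidate - before <= h[0] <= t_candidate + after]

buf = HitHistoryBuffer()
for t in range(0, 1000, 10):          # simulated hit stream
    buf.record(t, channel=t % 8, adc=100)
window = buf.chronicle(500)           # history around a candidate at t=500
```

Because the buffer always holds recent history, hits just before the candidate (including those that could signal temporary detector blindness) survive the read-out.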

2.
Within the ATLAS experiment, Trigger/DAQ and DCS are both logically and physically separated. Nevertheless, there is a need to communicate. The initial problem definition and analysis suggested three subsystems that the Trigger/DAQ-DCS Communication (DDC) project should support, with the ability to: 1. exchange data between Trigger/DAQ and DCS; 2. send alarm messages from DCS to Trigger/DAQ; 3. issue commands to DCS from Trigger/DAQ. Each subsystem is developed and implemented independently using a common software infrastructure. Among the various subsystems of the ATLAS Trigger/DAQ, the Online is responsible for control and configuration. It is the glue connecting the different systems such as data flow, the level-1 and high-level triggers. The DDC uses the various Online components as an interface point on the Trigger/DAQ side, with the PVSS II SCADA system on the DCS side, and addresses issues such as partitioning, time stamps, event numbers, hierarchy, authorization and security. PVSS II is a commercial product chosen by CERN to be the SCADA system for all LHC experiments. Its API provides full access to its database, which is sufficient to implement the three subsystems of the DDC software. The DDC project adopted the Online Software Process, which recommends a basic software life-cycle: problem statement, analysis, design, implementation and testing. Each phase results in a corresponding document or, in the case of implementation and testing, a piece of code. Inspection and review play a major role in the Online Software Process; the DDC documents have been inspected to detect flaws, resulting in improved quality. A first prototype of the DDC is ready and is foreseen to be used at the test beam during summer 2001.

3.
DSPs are widely used in data acquisition systems on neutron spectrometers at the IBR-2 pulsed reactor. In this report, several electronic blocks based on DSPs of the TMS320Cxxxx family from Texas Instruments (TI), intended to solve different tasks in DAQ systems, are described.

4.
The Gamma Ray Array Detector (GRAD) is one subsystem of HIRFL-ETF (the External Target Facility (ETF) of the Heavy Ion Research Facility in Lanzhou (HIRFL)). It is capable of measuring the energy of gamma rays with 1024 CsI scintillators in in-beam nuclear experiments. The GRAD trigger should select valid events and reject the data from scintillators that are not hit by a gamma ray. The GRAD trigger has been developed based on Field Programmable Gate Arrays (FPGAs) and a PXI interface. It makes prompt trigger decisions to select valid events by processing the hit signals from the 1024 CsI scintillators. According to the physical requirements, the GRAD trigger module supplies 12-bit trigger information for the global trigger system of ETF and a trigger signal for the data acquisition (DAQ) system of GRAD. In addition, the GRAD trigger generates trigger data that are packed and transmitted to the host computer via the PXI bus, to be saved for off-line analysis. The trigger processing is implemented in the front-end electronics of GRAD and in one FPGA of the GRAD trigger module; the logic of PXI transmission and reconfiguration is implemented in the other FPGA of the module. During gamma-ray experiments, the GRAD trigger performs reliably and efficiently and satisfies the physical requirements.
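The selection described above (accept the event, read out only hit channels, emit a 12-bit trigger word) can be sketched in software. The multiplicity-based decision below is a hypothetical stand-in: the abstract does not specify the actual FPGA trigger condition, only the 12-bit width and the hit-rejection behaviour.

```python
def grad_style_trigger(hit_mask, threshold=2):
    """Form a trigger decision from a 1024-channel hit mask.
    Returns (accept, trigger_word, readout_channels): accept fires when
    the hit multiplicity reaches `threshold` (an assumed condition); the
    12-bit trigger word carries the multiplicity, saturated at 4095;
    only hit channels are kept for read-out."""
    readout = [ch for ch, hit in enumerate(hit_mask) if hit]
    multiplicity = len(readout)
    trigger_word = min(multiplicity, 0xFFF)   # fits the 12-bit field
    return multiplicity >= threshold, trigger_word, readout

mask = [0] * 1024                  # one event: three scintillators fired
mask[17] = mask[42] = mask[900] = 1
accept, word, channels = grad_style_trigger(mask)
```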

5.
The CMS regional calorimeter trigger system detects signatures of electrons/photons, taus, jets, and missing and total transverse energy in a deadtimeless pipelined architecture. This system receives 7000 calorimeter trigger tower energies on 1.2 Gbaud digital copper-cable serial links and processes them in a low-latency pipelined design using custom-built electronics. At the heart of the system is the Receiver Card, which uses the new generation of gigabit Ethernet receiver chips on a mezzanine card to convert serial data to parallel data before transmission on a 160 MHz backplane for further processing by cards that sum energies and identify electrons and jets. We describe the algorithms and hardware implementation, and summarize the simulation results, which show that this system is capable of handling the rate requirements while triggering on physics signals with high efficiency.
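The energy-summing and jet-identification step can be illustrated with a sliding-window sum over a tower grid: sum each window, then keep windows that pass a threshold and are local maxima. This is a simplified sketch, not the CMS firmware algorithm; grid size, window size and threshold are invented.

```python
def jet_candidates(towers, window=3, threshold=20.0):
    """Scan a 2D grid of tower ETs (wrapping in both coordinates, as in
    phi) with a sliding window; return (eta, phi, sum_et) for windows
    that pass threshold and are local maxima of the window sums."""
    n_eta, n_phi = len(towers), len(towers[0])
    def wsum(i, j):
        return sum(towers[(i + di) % n_eta][(j + dj) % n_phi]
                   for di in range(window) for dj in range(window))
    sums = [[wsum(i, j) for j in range(n_phi)] for i in range(n_eta)]
    cands = []
    for i in range(n_eta):
        for j in range(n_phi):
            s = sums[i][j]
            neighbours = [sums[(i + di) % n_eta][(j + dj) % n_phi]
                          for di in (-1, 0, 1) for dj in (-1, 0, 1)
                          if (di, dj) != (0, 0)]
            if s >= threshold and all(s >= n for n in neighbours):
                cands.append((i, j, s))
    return cands

towers = [[0.0] * 8 for _ in range(8)]   # toy 8x8 grid, one hot tower
towers[3][3] = 30.0
cands = jet_candidates(towers)
```

With a single hot tower, every 3x3 window containing it sums to the same value, so the >= local-maximum test keeps all nine of them; real trigger logic breaks such ties in hardware.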

6.
The photoneutron source (PNS, phase 1), an electron linear accelerator (linac)-based pulsed neutron facility that uses the time-of-flight (TOF) technique, was constructed for the acquisition of nuclear data for the Thorium Molten Salt Reactor (TMSR) at the Shanghai Institute of Applied Physics (SINAP). The neutron detector signal used for TOF calculation, with information on the pulse arrival time, pulse shape, and pulse height, was recorded using a waveform digitizer (WFD). By using pulse-height and pulse-shape discrimination (PSD) analysis to identify neutrons and γ-rays, the neutron TOF spectrum was obtained with a simple electronic design, and a new WFD-based DAQ system was developed and tested in this commissioning experiment. The DAQ system is characterized by very high efficiency with respect to millisecond neutron TOF spectroscopy.
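A common way to do the PSD step on digitized waveforms is the charge-comparison method: integrate the pulse tail and compare it with the total charge, since neutron pulses in many scintillators carry a larger tail fraction than γ pulses. The sketch below uses toy waveforms and invented gate positions; it is not the PNS analysis code.

```python
def psd_ratio(waveform, gate_start, tail_start):
    """Charge-comparison PSD figure of merit: fraction of the integrated
    charge sitting in the pulse tail (tail_start onwards)."""
    total = sum(waveform[gate_start:])
    tail = sum(waveform[tail_start:])
    return tail / total if total > 0 else 0.0

# Two toy pulses with identical total charge but different decay tails.
gamma_like   = [0, 50, 30, 10, 5, 3, 2]
neutron_like = [0, 30, 25, 15, 12, 10, 8]
r_gamma = psd_ratio(gamma_like, 1, 3)       # -> 0.20
r_neutron = psd_ratio(neutron_like, 1, 3)   # -> 0.45
```

Events are then classified by cutting on this ratio (together with pulse height), which is exactly the kind of selection a WFD-based DAQ enables offline.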

7.
The BaBar data acquisition system (DAQ) transports data from the detector front-end electronics to short-term disk storage. A monitoring application (VMON) has been developed to monitor the one hundred and ninety computers in the dataflow system. Performance information for each CPU is collected and multicast across the existing data transport network. The packets are currently collected by a single UNIX workstation and archived. A ROOT-based GUI provides control and displays the DAQ performance in real time; the same GUI is reused to recover archived VMON data. VMON has been deployed and constantly monitors the BaBar dataflow system. It has been used for diagnostics and provides input to models projecting future performance. The application has no measurable impact on data taking, responds instantaneously on the human timescale to requests for information display, and uses only 3% of a 300 MHz Sun Ultra5 CPU.
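A monitoring scheme like this rests on a compact, fixed-size per-CPU sample that can be multicast cheaply and archived verbatim. The record layout below is invented for illustration (the actual VMON packet format is not given in the abstract):

```python
import struct

# Hypothetical fixed-size record for one node's CPU sample:
# node id, unix timestamp, user %, system % (network byte order).
RECORD = struct.Struct("!H I f f")

def pack_sample(node_id, timestamp, user_pct, sys_pct):
    """Serialize one performance sample for multicast/archiving."""
    return RECORD.pack(node_id, timestamp, user_pct, sys_pct)

def unpack_sample(payload):
    """Recover a sample from a received or archived packet."""
    return RECORD.unpack(payload)

pkt = pack_sample(42, 1700000000, 12.5, 3.25)
node, ts, user, sys_pct = unpack_sample(pkt)
```

A 14-byte record per node keeps the monitoring traffic negligible next to the event data on the same network, which is consistent with the "no measurable impact on data taking" claim.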

8.
The ALICE experiment [1] at the Large Hadron Collider (LHC) at CERN will detect up to 20,000 particles in a single Pb-Pb event, resulting in a data rate of ~75 MByte/event. The event rate is limited by the bandwidth of the data storage system. Higher rates are possible by selecting interesting events and subevents (High Level Trigger) or by compressing the data efficiently with modeling techniques. Both require fast parallel pattern recognition. One possible solution to process the detector data at such rates is a farm of clustered SMP nodes, based on off-the-shelf PCs and connected by a high-bandwidth, low-latency network.

9.
In modern high-energy and astrophysics experiments, the variety of user requirements and the complexity of the problem domain often involve the collaboration of several software frameworks, with different components responsible for providing the functionalities related to each domain. For instance, a common use case consists in studying the physics effects and the detector performance resulting from primary events in a given detector configuration, to evaluate the physics reach of the experiment or to optimise the detector design. Such a study typically involves various components: simulation, visualisation, analysis and an (interactive) user interface. We focus on the design aspects of the collaboration of these frameworks and on the technologies that help to simplify the complex process of software design.

10.
Congestion control for packets sent on a network is important for DAQ systems that contain an event builder using switching network technologies. Quality of Service (QoS) is a technique for congestion control, and recent Linux releases provide QoS in the kernel to manage network traffic. We have analyzed the packet loss and packet distribution for the event builder prototype of the ATLAS TDAQ system, using PC/Linux with a Gigabit Ethernet network as the testbed. The results showed that QoS using CBQ and TBF eliminated packet loss on UDP/IP transfer, while best-effort UDP/IP transfer suffered heavy packet loss; the QoS overhead was small. We conclude that QoS on Linux performs efficiently for TCP/IP and UDP/IP and will play an important role in the ATLAS TDAQ system.
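The TBF (Token Bucket Filter) qdisc mentioned above shapes traffic by spending tokens per byte sent and refilling them at a configured rate, so senders are paced instead of overrunning switch buffers. A minimal simulation of that mechanism (rates and sizes arbitrary, not the kernel implementation):

```python
class TokenBucket:
    """Token-bucket rate limiter: a packet may be sent only if enough
    tokens (bytes) are available; tokens refill at `rate` per second up
    to the bucket depth `burst`."""

    def __init__(self, rate, burst):
        self.rate, self.burst = rate, burst   # bytes/sec, bucket depth
        self.tokens, self.last = burst, 0.0

    def allow(self, now, packet_size):
        # Refill according to elapsed time, capped at the bucket depth.
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_size <= self.tokens:
            self.tokens -= packet_size
            return True
        return False      # held back (queued/delayed, not dropped)

tbf = TokenBucket(rate=1000, burst=1500)   # 1000 B/s, 1500 B bucket
first = tbf.allow(0.0, 1500)               # full burst fits: sent
second = tbf.allow(0.0, 100)               # bucket now empty: held back
third = tbf.allow(0.5, 100)                # 500 tokens refilled: sent
```

Holding packets back at the sender is why the shaped UDP/IP streams in the test lost nothing: the event builder's switch never sees bursts it cannot absorb.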

11.
The HERA-B data acquisition and triggering systems make use of Linux PC farms for triggering and online event reconstruction. In this paper we present the requirements, implementation and performance of both PC farms, which were fully operational during the year-2000 detector and trigger commissioning run.

12.
The Alpha Magnetic Spectrometer (AMS) is an experiment to search in space for dark matter, missing matter and antimatter, scheduled for installation on the International Space Station (ISS) Alpha. The AMS detector had a precursor flight in June 1998 on board the space shuttle Discovery during STS-91; more than 100M events were collected and analyzed. The detector will have another flight in the fall of 2003, lasting more than three years on the ISS. The data will be transmitted from the ISS to the NASA Marshall Space Flight Center (Huntsville, Alabama) and then to MIT and CERN for processing and analysis. In this report we describe the AMS software, in particular the conditions database and the data processing software.

13.
Signal distortion due to the non-uniform response of the detector degrades the measurement accuracy of most metrology instruments. In this Letter, we report a newly developed calibration source system for reference-based non-uniformity correction using a laser source, a fiber, and a diffusive module. By applying Monte Carlo simulation, we show that the transmittance of the system depends strongly on the cavity reflection of the diffusive module. We also demonstrate the use of this system to achieve a flat field at a very low non-uniformity (less than 0.2%) with proper illumination intensity, which most costly commercial integrating-sphere systems traditionally cannot provide.
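Reference-based non-uniformity correction of the kind described here typically divides each pixel by its normalised flat-field response measured with the calibration source. A minimal sketch with invented four-pixel data (the Letter's actual processing is not specified):

```python
def flat_field_correct(raw, flat):
    """Divide each pixel by its normalised flat-field response,
    removing the detector's fixed-pattern non-uniformity."""
    mean_flat = sum(flat) / len(flat)
    return [r * mean_flat / f for r, f in zip(raw, flat)]

def non_uniformity(pixels):
    """RMS deviation from the mean, as a fraction of the mean."""
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return (var ** 0.5) / mean

flat = [0.95, 1.00, 1.05, 1.00]      # response to the uniform source
raw  = [95.0, 100.0, 105.0, 100.0]   # uniform scene seen through it
corrected = flat_field_correct(raw, flat)
```

The quality of the correction is bounded by how flat the reference field itself is, which is why the sub-0.2% non-uniformity of the source matters.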

14.
Li Liang, Gong Guanghua, Zeng Ming. Chinese Physics C, 2011, 35(12): 1139-1142
Silicon photomultipliers (SiPMs) have remarkable advantages for use in photo-detection. Compared with PMTs, SiPMs offer high gain, excellent time resolution, insensitivity to magnetic fields and a lower operating voltage. SiPMs from Hamamatsu are used in the electromagnetic calorimeter (ECAL) sub-detector of the Positron Electron Balloon Spectrometer (PEBS) experiment, a balloon-borne spectrometer aiming at the precise measurement of the cosmic-ray positron fraction. This paper introduces the evaluation and test results of several SiPM detector types, the dedicated front-end application-specific integrated circuit (ASIC) electronics, and the design of the data acquisition (DAQ) system.

15.
Reconstruction and subsequent particle identification is a challenge in a complex, high-luminosity environment such as that expected in the ATLAS detector at the LHC. The ATLAS software has chosen the object-oriented paradigm and has recently migrated many of its software components developed earlier using procedural programming languages. The new software, which emphasizes the separation between algorithms and data objects, has been successfully integrated into the broader ATLAS framework. We present a status report of the reconstruction software, summarizing the experience gained in the migration of several software components. We examine some components of the calorimeter software design, including the simulation of real-time detector effects and the online environment, and the strategies deployed for particle identification.

16.
Owing to the strong material absorption and low refractive-index contrast at extreme ultraviolet wavelengths, all-reflective optical systems are the most suitable choice for extreme ultraviolet lithography (EUVL). In this letter, we present a design for an all-reflective lithographic projection lens and discuss its design idea and structural system. After analysis of the four-mirror optical system, the initial structural parameters are determined, the optical system is optimized, and the tolerances of the system are analyzed. We also show that the design achieves an optimal layout and the desired imaging performance.

17.
We design a novel X-ray image detector by lens-coupling a Gd₂O₂S:Tb intensifying screen with a high-performance low-light-level (L³, here meaning luminance below 10⁻³ lux) image intensifier. The different effects of coupling with a zoom lens versus a fixed-focus lens on imaging performance are analyzed theoretically. In experiment, for a detector with a 15-inch field of view, the system coupled by the zoom lens achieves a resolution of 12.25 lp/cm, while the one with the fixed-focus lens achieves 10 lp/cm, validating the superiority of the zoom lens. It is concluded that the zoom lens preserves the image information better than the fixed-focus lens and improves the imaging system's performance in this design, which is of reference value for the design of other optical imaging systems.

18.
Parabolic trough collectors generate thermal energy from solar energy and are particularly well suited to high-temperature solar power systems. To determine the design parameters, parabolic trough collectors must be analysed optically; in addition, a thermodynamic (energy and exergy) analysis must be carried out when developing an energy-efficient system. Solar radiation passes through Earth's atmosphere until it reaches Earth's surface and is focused by the parabolic trough collector onto the tube receiver with a transparent insulated envelope; all of these together constitute a complex mechanism. We investigate the geometry of the parabolic trough reflector and the characteristics of the solar radiation reaching the reflecting surface through Earth's atmosphere, and calculate the total collected energy in the receiver. The parabolic trough collector, whose design parameters are given, is analysed with regard to energy and exergy, considering the meteorological conditions in May, June, July and August in Isparta/Turkey, and the results are presented.
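The exergy part of such an analysis weights the useful heat by the Carnot factor at the receiver temperature, Ex = Q·(1 − T0/T). A minimal numeric sketch (the temperatures and heat rate below are illustrative, not the paper's Isparta data):

```python
def collector_exergy(q_useful, t_ambient, t_receiver):
    """Exergy of useful heat Q delivered at receiver temperature
    t_receiver with dead-state (ambient) temperature t_ambient,
    both in kelvin: Ex = Q * (1 - T0 / T)."""
    return q_useful * (1.0 - t_ambient / t_receiver)

# Example: 5 kW of heat at 600 K with a 300 K ambient.
ex = collector_exergy(5000.0, 300.0, 600.0)   # -> 2500.0 W of exergy
```

This is why trough collectors are attractive at high temperature: the same heat carries more exergy as T rises above ambient.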

19.
20.
This article introduces the design and performance of the data acquisition system used in an omnidirectional gamma-ray positioning system, along with a new method used in this system to obtain the position of radiation sources in a large field. This data acquisition system has various built-in interfaces collecting, in real time, information from the radiation detector, the video camera and the GPS positioning module. Experiments show that the data acquisition system is capable of carrying out the proposed quantitative analysis to derive the position of radioactive sources, which also satisfies the requirements of high stability and reliability.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号