2.
An interlaboratory study was performed on behalf of the UK Food Standards Agency to evaluate the effectiveness of an immunoaffinity column cleanup liquid chromatographic (LC) method for the determination of deoxynivalenol in a variety of cereals and cereal products at proposed European regulatory limits. The test portion was extracted with water. The sample extract was filtered and applied to an immunoaffinity column. After the column was washed with water, the deoxynivalenol was eluted with acetonitrile or methanol and quantitated by reversed-phase LC with UV detection. Samples of artificially contaminated wheat flour, rice flour, oat flour, polenta, and wheat-based breakfast cereal, naturally contaminated wheat flour, and blank (very low level) samples of each matrix were sent to 13 collaborators in 7 European countries. Participants were asked to spike test portions of all samples at deoxynivalenol concentrations equivalent to 200-2000 ng/g. Average recoveries ranged from 78 to 87%. Based on results for 6 artificially contaminated samples (blind duplicates), the relative standard deviation for repeatability (RSDr) ranged from 3.1 to 14.1%, and the relative standard deviation for reproducibility (RSDR) ranged from 11.5 to 26.3%. The method showed acceptable within-laboratory and between-laboratory precision for all 5 matrixes, as evidenced by HorRat values < 1.3.
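The HorRat criterion cited here compares the observed RSDR with the value predicted by the Horwitz function. A minimal sketch of that check, assuming the usual form of the Horwitz equation; the pairing of RSD values with spike levels below is illustrative, not taken from the study:

```python
# Minimal sketch: HorRat = observed RSD_R / Horwitz-predicted RSD_R.
# The Horwitz function predicts RSD_R (%) = 2 * C**(-0.15), where C is the
# analyte concentration as a dimensionless mass fraction (ng/g -> 1e-9).

def horwitz_rsd(mass_fraction: float) -> float:
    """Predicted between-laboratory RSD (%) from the Horwitz function."""
    return 2.0 * mass_fraction ** -0.15

def horrat(observed_rsd_r_percent: float, conc_ng_per_g: float) -> float:
    c = conc_ng_per_g * 1e-9            # ng/g expressed as a mass fraction
    return observed_rsd_r_percent / horwitz_rsd(c)

# Illustrative pairings in the range reported above (not the study's data):
print(round(horrat(26.3, 200.0), 2))    # higher RSD at the lowest spike level
print(round(horrat(11.5, 2000.0), 2))   # lower RSD at the highest spike level
```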
4.
Lyn JA, Ramsey MH, Damant AP, Wood R. The Analyst, 2007, 132(12): 1231-1237.
Measurement uncertainty is a vital issue within analytical science. There are strong arguments that primary sampling should be considered the first and perhaps the most influential step in the measurement process. Increasingly, analytical laboratories are required to report measurement results to clients together with estimates of the uncertainty. Furthermore, these estimates can be used in regulation enforcement to decide whether a measured analyte concentration is above a threshold value. Given its recognised importance in analytical measurement, the question arises: what is the most appropriate method to estimate the measurement uncertainty? Two broad methods for uncertainty estimation are identified: the modelling method and the empirical method. In the modelling method, the estimation of uncertainty involves the identification, quantification and summation (as variances) of each potential source of uncertainty. This approach has been applied to purely analytical systems, but becomes increasingly problematic in identifying all such sources when it is applied to primary sampling. Applications of this methodology to sampling often utilise long-established theoretical models of sampling and adopt the assumption that a 'correct' sampling protocol will ensure a representative sample. The empirical approach to uncertainty estimation involves replicated measurements from inter-organisational trials and/or internal method validation and quality control. A simpler method involves duplicating sampling and analysis, by one organisation, for a small proportion of the total number of samples. This has proven to be a suitable alternative to these often expensive and time-consuming trials, in routine surveillance and one-off surveys, especially where heterogeneity is the main source of uncertainty. A case study of aflatoxins in pistachio nuts is used to demonstrate broadly the strengths and weaknesses of the two methods of uncertainty estimation. The estimate of sampling uncertainty made using the modelling approach (136%, at 68% confidence) is six times larger than that found using the empirical approach (22.5%). The difficulty in establishing reliable estimates of the input variables for the modelling approach is thought to be the main cause of the discrepancy. The empirical approach to uncertainty estimation, with the automatic inclusion of sampling within the uncertainty statement, is recognised as generally the more practical procedure, providing the more reliable estimates. The modelling approach is also shown to have a useful role, especially in choosing strategies to change the sampling uncertainty, when required.
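The 'duplicate method' mentioned above estimates uncertainty from paired results on a subset of sampling targets. A minimal sketch under the simplest design (two samples per target, one analysis each, so sampling and analytical variation are not separated; the data are invented for illustration — published applications typically use robust ANOVA instead):

```python
import numpy as np

# Duplicate method, simplest design: take two samples from each of n targets
# and analyse each once. The combined (sampling + analysis) standard deviation
# follows from the paired differences: s = sd(d) / sqrt(2).

s1 = np.array([10.2, 9.8, 11.1, 10.5, 9.9, 10.7, 10.0, 10.4])   # first duplicates
s2 = np.array([ 9.6, 10.3, 10.6, 11.2, 9.5, 10.1, 10.8, 10.2])  # second duplicates

d = s1 - s2
s_meas = d.std(ddof=1) / np.sqrt(2)            # combined measurement sd
mean_conc = np.concatenate([s1, s2]).mean()
u_rel = 100 * s_meas / mean_conc               # relative standard uncertainty, %

print(f"s_meas = {s_meas:.3f}, relative uncertainty = {u_rel:.1f}% (68% confidence)")
```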
5.
Electrospray ionization Fourier transform ion cyclotron resonance mass spectrometry (ESI-FTICRMS) has been used to determine the mass of a double-stranded 500 base-pair (bp) polymerase chain reaction (PCR) product with an average theoretical mass, for the blunt-ended (i.e. unadenylated) species, of 308 859.35 Da. The PCR product was generated from the linearized bacteriophage Lambda genome, a double-stranded template. Utilization of ethanol precipitation in tandem with a rapid microdialysis step to purify and desalt the PCR product was crucial to obtaining a precise mass measurement. The PCR product (0.8 pmol/μL) was electrosprayed from a solution containing 75% acetonitrile, 25 mM piperidine, and 25 mM imidazole and was infused at a rate of 200 nL/min. The average molecular mass and the corresponding precision were determined using charge states ranging from 172 to 235 net negative charges. The experimental mass and corresponding precision (reported as the 95% confidence interval of the mean) were 309 406 ± 27 Da (87 ppm). The mass accuracy was compromised because Taq polymerase generates multiple products through non-template-directed 3'-adenylation. This results in a mixture of three PCR products of nearly identical mass (i.e. blunt-ended, mono-adenylated and di-adenylated), with unknown relative abundances, that were not resolved in the spectrum. The experimental mass is therefore a weighted average of the three species which, under our experimental conditions, reflects a nearly equal concentration of the mono- and di-adenylated species. This report demonstrates that precise mass measurements of PCR products up to 309 kDa (500 bp) can be routinely obtained by ESI-FTICRMS from low femtomole amounts.
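Neutral masses such as the one quoted here come from deconvoluting the multiply charged negative-ion series. A sketch of the standard arithmetic for a pair of adjacent charge states (the peak values below are simulated for illustration, not taken from the spectrum):

```python
# For negative ions [M - zH]^z-, the observed m/z = (M - z*m_p)/z, where
# m_p = 1.00728 Da is the proton mass. Two adjacent peaks (m1 at charge z,
# m2 at charge z-1, with m2 > m1) fix the charge and hence the neutral mass.

M_P = 1.00728  # proton mass, Da

def charge_from_adjacent_peaks(m1: float, m2: float) -> float:
    """Charge z of the lower-m/z peak of an adjacent negative-ion pair."""
    return (m2 + M_P) / (m2 - m1)

def neutral_mass(mz: float, z: int) -> float:
    """Neutral mass M from one peak of known charge (negative mode)."""
    return z * (mz + M_P)

# Illustrative pair for a ~309 kDa species at z = 200 and z = 199:
m1 = (309406 - 200 * M_P) / 200     # simulated z = 200 peak
m2 = (309406 - 199 * M_P) / 199     # simulated z = 199 peak
z = round(charge_from_adjacent_peaks(m1, m2))
print(z, round(neutral_mass(m1, z), 1))   # -> 200 309406.0
```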
6.
A method has been developed to determine 11 phenolic antioxidants in the food simulants distilled water, 3% acetic acid, and 15% ethanol, using micellar capillary electrophoresis (MCE). All the phenols could be analyzed within 35 min. The analytical recovery from spiked simulants was 80 to 119%, except for 2,6-di-tert-butyl-4-hydroxytoluene (BHT) and octyl gallate, which could not be recovered from the 3% acetic acid simulant. Calibration graph correlation coefficients for the 11 phenols were 0.982 to 0.999. Limits of detection (LoDs) were 2.8 to 8.6 mg/L, well below European Union migration limits for these substances. It is concluded, therefore, that MCE offers a rapid and reliable analysis for the control of migration from plastics intended for food contact that employ these phenols as antioxidants.
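Detection limits of this kind are commonly derived from the calibration graph. A sketch of one conventional approach, assuming the 3.3·s/slope convention on invented calibration data (the abstract does not state which LoD convention was used):

```python
import numpy as np

# Linear calibration by least squares, then LoD = 3.3 * s_residual / slope
# (one common convention; the paper may have used a different one).

conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])    # mg/L standards (invented)
area = np.array([0.1, 4.9, 10.3, 19.6, 40.5, 79.8])    # detector response (invented)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
s_res = residuals.std(ddof=2)            # ddof=2: two fitted parameters
r = np.corrcoef(conc, area)[0, 1]        # calibration correlation coefficient

lod = 3.3 * s_res / slope
print(f"r = {r:.4f}, LoD = {lod:.1f} mg/L")
```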
7.
A recently proposed method of assessing sampling uncertainty has been tested by applying it to the sampling and analysis of several types of food and an animal feedstuff. In this 'SAD' method, the increments comprising the conventional sample (that is, collected in the fashion prescribed by the standard sampling protocol) are allocated to one of two equal-sized 'splits', which are prepared and analysed separately. The absolute difference between the analytical results for the two splits (the split absolute difference, or SAD) is plotted on a one-sided control chart. A non-compliance indicates that the combined uncertainty of sampling and analysis is larger than expected and that the result of the measurement (the mean of the two split results) is possibly not fit for purpose. In addition, the SAD results give rise to a rugged estimate of the uncertainty associated with the sampling protocol, often a major part of the total measurement uncertainty.
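A minimal sketch of the SAD calculation as described above, assuming standard range-chart constants for duplicates (d2 = 1.128, upper action limit 3.686σ, as in conventional R-chart practice); the split results are invented:

```python
import numpy as np

# SAD method: increments are allocated to two equal splits, analysed separately.
# Chart the split absolute differences; estimate the combined sd from the mean
# range (sigma ~= mean(|d|) / d2, with d2 = 1.128 for pairs).

split_a = np.array([3.1, 2.9, 3.4, 3.0, 3.2, 2.8, 3.3])   # invented results
split_b = np.array([2.8, 3.0, 3.1, 3.3, 2.9, 3.0, 3.6])

sad = np.abs(split_a - split_b)
sigma = sad.mean() / 1.128               # rugged combined sampling+analysis sd
upper_action = 3.686 * sigma             # one-sided action limit for the chart

for i, d in enumerate(sad, 1):
    flag = "NON-COMPLIANT" if d > upper_action else "ok"
    print(f"target {i}: SAD = {d:.2f} ({flag})")
print(f"estimated sd = {sigma:.3f}")
```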
8.
Thompson M, Owen L, Wilkinson K, Wood R. The Analyst, 2002, 127(12): 1666-1668.
Both the Kjeldahl and the Dumas methods for the determination of protein in foodstuffs are currently in use, but the empirical nitrogen factors used to convert the determined nitrogen content to protein content are based on the Kjeldahl method alone. Non-equivalence between the two methods could therefore result in some laboratories reporting an incorrect protein content. We report here a study using data accumulated over several years from a proficiency testing scheme. On average, the Dumas method provided results about 1.4% higher, in relative terms, than the Kjeldahl method, but the difference between the methods depended on the type of foodstuff. The methodology for detecting bias between analytical methods is critically discussed.
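A sketch of one plausible form of such a between-method comparison (the abstract does not specify the test used): pair the two methods' mean results round by round, express the difference relatively, and test the mean relative bias against zero. All values are invented; they are merely scaled to echo the ~1.4% figure above.

```python
import numpy as np
from scipy import stats

# Paired comparison of two methods across proficiency test rounds:
# one mean result per method per round (invented values, % nitrogen).

kjeldahl = np.array([2.41, 3.10, 1.95, 2.77, 3.52, 2.20, 2.98, 3.31])
dumas    = np.array([2.45, 3.15, 1.97, 2.80, 3.58, 2.23, 3.02, 3.36])

rel_diff = 100 * (dumas - kjeldahl) / kjeldahl   # % relative difference per round
t, p = stats.ttest_1samp(rel_diff, 0.0)          # is the mean relative bias zero?

print(f"mean relative bias = {rel_diff.mean():.2f}%  (t = {t:.2f}, p = {p:.4f})")
```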
9.
Lyn JA, Ramsey MH, Damant AP, Wood R. The Analyst, 2005, 130(9): 1271-1279.
Uncertainty estimates from routine sampling and analytical procedures can be assessed as being fit for purpose using the optimised uncertainty (OU) method. The OU method recommends an optimal level of uncertainty that should be reached in order to minimise the expected financial loss arising from misclassification of a batch as a result of the uncertainty. Sampling theory can be used as a predictive tool when a change in sampling uncertainty is recommended by the OU method. The OU methodology has been applied iteratively for the first time in a case study of wholesale butter and the determination of five quality indicators (moisture, fat, solids-not-fat (SNF), peroxide value (PV) and free fatty acid (FFA)). The sampling uncertainty (s(samp)) was found to be sub-optimal for moisture and PV determination with 3-fold composite samples. A revised sampling protocol was devised using Gy's sampling theory. It was predicted that an increase in sample mass would reduce the sampling uncertainty to the optimal level, resulting in an expected saving of over £2000 per 20 tonne batch compared with current methods. Application of the optimal protocol did not, however, achieve the desired reduction in s(samp), owing to limitations in sampling theory. The OU methodology proved to be a useful tool for identifying broad weaknesses within a routine protocol and assessing fitness for purpose. However, the successful routine application of sampling theory, as part of the optimisation process, requires substantial prior knowledge of the sampling target.
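The mass-scaling step described above rests on the prediction from sampling theory that fundamental sampling variance is inversely proportional to sample mass. A sketch of that scaling, with illustrative figures; Gy's full formula involves additional material constants not shown here:

```python
# From Gy's theory, the fundamental sampling variance scales as 1/mass:
#   s_samp^2 ∝ 1/m   =>   m_required = m_current * (s_current / s_target)**2

def required_mass(m_current_g: float, s_current: float, s_target: float) -> float:
    """Sample mass predicted to bring s_samp down to the target level."""
    return m_current_g * (s_current / s_target) ** 2

# Illustrative: halving the sampling sd requires four times the sample mass.
print(required_mass(100.0, s_current=0.40, s_target=0.20))   # -> 400.0 g
```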
10.
In the FAPAS proficiency testing scheme, participants are asked to state whether the analytical method used was accredited or not. It is thus possible to compare the stated accreditation status with performance in the scheme. For this purpose, fifty qualifying examples of analyte-test material combinations were selected at random from the reports for the year 2006. The accredited and non-accredited subsets of results from each example were subjected to a statistical analysis to determine whether any significant differences between the distributions of results could be detected. Outliers were removed from the datasets before the main statistical tests and considered separately. The inlying data were subjected to non-parametric tests for differences in central tendency and dispersion. A few significant examples were found, but these could reasonably be attributed to chance. Among the inliers there were no grounds to reject the overall null hypothesis, namely that accreditation has no effect on performance. However, the proportion of outliers was about twice as high in the non-accredited group.
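A sketch of one plausible version of the comparison described: screen outliers, then apply a Mann-Whitney test for central tendency and an Ansari-Bradley test for dispersion. The abstract does not name the specific tests, and the z-scores below are simulated stand-ins:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
accredited     = rng.normal(0.0, 1.0, 40)   # simulated z-scores, accredited labs
non_accredited = rng.normal(0.1, 1.3, 25)   # simulated z-scores, other labs

def drop_outliers(x, limit=3.0):
    """Simple screen: discard results with |z| above the limit."""
    return x[np.abs(x) <= limit]

a, n = drop_outliers(accredited), drop_outliers(non_accredited)
_, p_location = stats.mannwhitneyu(a, n)    # central tendency
_, p_spread   = stats.ansari(a, n)          # dispersion

print(f"location p = {p_location:.3f}, dispersion p = {p_spread:.3f}")
```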