909 results for SHORT-CONTACT TIMES
Abstract:
This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not violate the chamber capacity. PCBs from different production lines arrive dynamically at a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is determined by the last PCB to arrive to the batch. ESS chambers are expensive and constitute a bottleneck; consequently, the makespan has to be minimized. A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, the number of linear constraints, and run time. A procedure to compute a lower bound is also proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum. The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, particularly to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches in terms of solution quality and run time. The decomposition approach improved the lower bounds (i.e. the linear relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement as their run times are very short.
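To make the problem structure concrete, here is a minimal sketch of one possible greedy batching rule for this setting. It is not one of the five heuristics proposed in the work; the class and function names (PCB, greedy_batch_makespan) are illustrative assumptions. It only encodes the two defining rules stated above: a batch's processing time is the longest processing time among its PCBs, and its ready time is the latest arrival among them.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PCB:
    size: float          # space the board occupies in a chamber
    proc_time: float     # required testing time
    ready_time: float    # arrival time at the queue

def greedy_batch_makespan(pcbs: List[PCB], capacity: float, n_chambers: int) -> float:
    """Group PCBs into capacity-feasible batches (longest processing time first),
    then assign batches to the earliest-available chamber and return the makespan."""
    batches: List[List[PCB]] = []
    for pcb in sorted(pcbs, key=lambda p: p.proc_time, reverse=True):
        placed = False
        for batch in batches:
            if sum(p.size for p in batch) + pcb.size <= capacity:
                batch.append(pcb)
                placed = True
                break
        if not placed:
            batches.append([pcb])

    # A batch starts only after its last PCB arrives; its processing time is the
    # longest processing time among its PCBs.
    chamber_free = [0.0] * n_chambers
    for batch in sorted(batches, key=lambda b: max(p.ready_time for p in b)):
        ready = max(p.ready_time for p in batch)
        proc = max(p.proc_time for p in batch)
        k = min(range(n_chambers), key=lambda i: chamber_free[i])
        chamber_free[k] = max(chamber_free[k], ready) + proc
    return max(chamber_free)
```

Sorting by longest processing time first is a common packing rule; the work's own heuristics, metaheuristics and column generation approach are considerably more sophisticated than this sketch.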
Abstract:
Establishing an association between the scent a perpetrator left at a crime scene and the odor of the suspect of that crime is the basis for the use of human scent identification evidence in a court of law. Law enforcement agencies gather evidence through the collection of scent from the objects that a perpetrator may have handled during the execution of the criminal act. The collected scent evidence is subsequently presented to canines in identification line-up procedures with the apprehended suspects. Presently, canine scent identification is admitted as expert witness testimony; however, the accuracy of the dogs' behavior and the scent collection methods used are often challenged by the court system. The primary focus of this research project was an evaluation of contact and non-contact scent collection techniques, with an emphasis on optimizing collection materials of different fiber chemistries and evaluating the chemical odor profiles obtained under varying environmental conditions, in order to provide a better scientific understanding of human scent as a discriminative tool in the identification of suspects. The collection of hand odor from female and male subjects through both contact and non-contact sampling approaches yielded new insights into the types of volatile organic compounds (VOCs) collected when different materials are utilized, a comparison that had not previously been performed instrumentally. Furthermore, the highest scent mass for the hand odor samples of both genders was obtained on cotton sorbent materials. The contact sampling methods yielded a higher number of volatiles than non-contact sampling, an enhancement of up to 3 times, as well as a scent mass more than an order of magnitude higher. The evaluation of the STU-100 as a non-contact methodology highlighted strong instrumental drawbacks that need to be targeted for enhanced scientific validation of current field practices. These results demonstrated that an individual's human scent components vary considerably depending on the method used to collect scent from the same body region. This study demonstrated the importance of the collection medium selected, as well as the collection method employed, in providing a reproducible human scent sample that can be used to differentiate individuals.
Abstract:
The accurate and reliable estimation of travel time based on point detector data is needed to support Intelligent Transportation System (ITS) applications. It has been found that the quality of travel time estimation depends on the estimation method used and varies across traffic conditions. In this study, two hybrid on-line travel time estimation models, and their corresponding off-line methods, were developed to achieve better estimation performance under various traffic conditions, including recurrent congestion and incidents. The first model combines the Mid-Point method, which is a speed-based method, with a traffic flow-based method. The second model integrates two speed-based methods: the Mid-Point method and the Minimum Speed method. In both models, the switch between travel time estimation methods is based on the congestion level and queue status automatically identified by clustering analysis. During incident conditions with rapidly changing queue lengths, shock wave analysis-based refinements are applied for on-line estimation to capture the fast queue propagation and recovery. Travel time estimates obtained from existing speed-based methods, traffic flow-based methods, and the developed models were tested using both simulation and real-world data. The results indicate that all tested methods performed at an acceptable level during periods of low congestion; however, their performance varies with increasing congestion. Comparisons with other estimation methods also show that the developed hybrid models perform well in all cases. Further comparisons between the on-line and off-line travel time estimation methods reveal that off-line methods perform significantly better only during fast-changing congested conditions, such as during incidents. The impacts of major influential factors on the performance of travel time estimation, including data preprocessing procedures, detector errors, detector spacing, frequency of travel time updates to traveler information devices, travel time link length, and posted travel time range, were also investigated in this study. The results show that these factors have more significant impacts on estimation accuracy and reliability under congested conditions than during uncongested conditions. For incident conditions, the estimation quality improves with the use of a short rolling period for data smoothing, more accurate detector data, and frequent travel time updates.
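As a rough illustration of the speed-based Mid-Point idea referred to above, the sketch below estimates link travel time by letting each detector's spot speed prevail from the midpoint of the upstream spacing to the midpoint of the downstream spacing. The function name, units and detector layout are illustrative assumptions, not the dissertation's actual implementation, which additionally switches methods by congestion level and applies shock wave refinements.

```python
from typing import List

def midpoint_travel_time(positions_mi: List[float], speeds_mph: List[float]) -> float:
    """Speed-based Mid-Point estimate of travel time (hours) over a link
    instrumented with point detectors. Each detector's speed is assumed to
    prevail from the midpoint of the upstream spacing to the midpoint of the
    downstream spacing."""
    assert len(positions_mi) == len(speeds_mph) >= 2
    start, end = positions_mi[0], positions_mi[-1]
    total = 0.0
    for i, (x, v) in enumerate(zip(positions_mi, speeds_mph)):
        left = start if i == 0 else 0.5 * (positions_mi[i - 1] + x)
        right = end if i == len(positions_mi) - 1 else 0.5 * (x + positions_mi[i + 1])
        total += (right - left) / v
    return total

# Example: three detectors at mileposts 0, 1 and 2 reporting 60, 30 and 55 mph.
print(midpoint_travel_time([0.0, 1.0, 2.0], [60.0, 30.0, 55.0]) * 60, "minutes")
```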
Abstract:
The growing need for fast sampling of explosives in high-throughput areas has increased the demand for improved technology for the trace detection of illicit compounds. Detection of the volatiles associated with the presence of the illicit compounds offers a different approach for sensitive trace detection of these compounds without increasing the false positive alarm rate. This study evaluated the performance of non-contact sampling and detection systems using statistical analysis through the construction of Receiver Operating Characteristic (ROC) curves in real-world scenarios for the detection of volatiles in the headspace of smokeless powder, used as the model system for generalizing explosives detection. A novel sorbent-coated disk, coined planar solid phase microextraction (PSPME), was previously used for rapid, non-contact sampling of the headspace of containers. The limits of detection for PSPME coupled to ion mobility spectrometry (IMS) detection were determined to be 0.5-24 ng for vapor sampling of volatile chemical compounds associated with illicit compounds, and PSPME demonstrated an extraction efficiency three times greater than that of other commercially available substrates, retaining >50% of the analyte after 30 minutes of sampling an analyte spike, compared to a non-detect for the unmodified filters. Both static and dynamic PSPME sampling were used, coupled with two IMS detection systems, in which 10-500 mg quantities of smokeless powders were detected within 5-10 minutes of static sampling and 1 minute of dynamic sampling in 1-45 L closed systems, resulting in faster sampling and analysis times in comparison to conventional solid phase microextraction-gas chromatography-mass spectrometry (SPME-GC-MS) analysis. Similar real-world scenarios were sampled in low- and high-clutter environments with zero false positive rates. Excellent PSPME-IMS detection of the volatile analytes was evident from the ROC curves, with areas under the curve (AUC) of 0.85-1.0 and 0.81-1.0 for portable and bench-top IMS systems, respectively. ROC curves were also constructed for SPME-GC-MS, yielding AUC of 0.95-1.0, comparable with PSPME-IMS detection. The PSPME-IMS technique produces fewer false positive results for non-contact vapor sampling, cutting cost and providing the effective sampling and detection needed in high-throughput scenarios, with performance similar to that of well-established techniques and the added advantage of fast detection in the field.
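For readers unfamiliar with how ROC curves summarize a detection system, the sketch below computes an empirical AUC from detector scores for samples with and without the target present. The scores and labels are made up for illustration; they are not the study's data, and the resulting AUC has no relation to the 0.81-1.0 values reported above.

```python
import numpy as np

def roc_auc(scores: np.ndarray, labels: np.ndarray) -> float:
    """Empirical ROC AUC: the probability that a randomly chosen positive sample
    scores higher than a randomly chosen negative one (ties count half)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Illustrative only: detector responses for samples with (1) and without (0) powder present.
scores = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2])
labels = np.array([1,   1,   1,    0,   1,    0,   0,   0])
print(round(roc_auc(scores, labels), 3))
```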
Abstract:
The period from 1874 to 1901 was a time of significant transition in the economic and political life of Newfoundland. Twenty years into responsible government and with Confederation on the backburner, the colony’s politicians turned their attention to economic diversification, landward development and carving out the island’s place in the British Empire. The period saw both economic prosperity and retrenchment; the construction of a trans-insular railway; the adoption of policies to foster agriculture, forestry, manufacturing and mining; and diplomatic efforts to resolve France’s outstanding claims on the northwest coast of the island. At the same time, the government made an attempt to intervene directly in its primary industry, the fisheries. It created a Fisheries Commission in 1889 that recommended conservation measures and artificial propagation as ways to restore the health of some of the island’s fish stocks. The commission also proposed new methods of curing, packaging and marketing Newfoundland’s cod, as well as a complete overhaul of the truck system. A major player in both the public and private debates surrounding all of these subjects was the Reverend Moses Harvey. Along with being minister of the Free Church of Scotland in St. John’s, Harvey was one of Newfoundland’s most active promoters in the late nineteenth century. He served as the media mouthpiece for both Prime Minister William Whiteway and Prime Minister Robert Thorburn, editing the Evening Mercury – the official organ of the Liberal Party and then the Reform Party – from 1882 to 1883 and from 1885 until 1890. As well, Harvey wrote regular columns on Newfoundland issues for newspapers in London, New York, Boston, Montreal, Toronto, and Halifax. He also produced numerous books, articles, encyclopedia entries, and travel guides outlining the island’s attractions and its vast economic potential. In short, Harvey made a significant contribution to shaping the way residents and the outside world viewed Newfoundland during this period. This thesis examines late nineteenth-century Newfoundland through the writing of Moses Harvey. The biographical approach offers a fuller, more nuanced account of some of the major historical themes of the period, including the politics of progress, opening up the interior, railway construction and attitudes toward the fisheries. It also provides an insider’s perspective on what led to some of the major political decisions, policy positions or compromises taken by the Whiteway and Thorburn governments. Finally, a more detailed review of Harvey’s work exposes the practical and political differences that he had with people like D.W. Prowse and Bishop Michael Howley. While these so-called “boomers” in Newfoundland’s historiography agreed on broad themes, they parted ways over what should be done with the fisheries and how best to channel the colony’s growing sense of nationalism.
Abstract:
Cold-water corals are amongst the most three-dimensionally complex deep-sea habitats known and are associated with high local biodiversity. Despite their importance as ecosystem engineers, little is known about how these organisms will respond to projected ocean acidification. Since preindustrial times, average ocean pH has already decreased from 8.2 to ~8.1. Predicted CO2 emissions will decrease this by up to another 0.3 pH units by the end of the century. This decrease in pH may have a wide range of impacts upon marine life, and in particular upon calcifiers such as cold-water corals. Lophelia pertusa is the most widespread cold-water coral (CWC) species, frequently found in the North Atlantic. The data here relate to a short-term (21 day) data set on the metabolism and net calcification rates of freshly collected L. pertusa from the Mingulay Reef Complex, Scotland. These data will help define the impact of ocean acidification upon the growth, physiology and structural integrity of this key reef framework-forming species.
Abstract:
Metagenomic studies use high-throughput sequence data to investigate microbial communities in situ. However, considerable challenges remain in the analysis of these data, particularly with regard to speed and to the reliable analysis of microbial species as opposed to higher-level taxa such as phyla. We here present Genometa, a computationally undemanding graphical user interface program that enables identification of bacterial species and gene content from datasets generated by inexpensive high-throughput short read sequencing technologies. Our approach was first verified on two simulated metagenomic short read datasets, detecting 100% and 94% of the bacterial species included, with few false positives or false negatives. Subsequent comparative benchmarking analysis against three popular metagenomic algorithms on an Illumina human gut dataset revealed Genometa to attribute the most reads to bacteria at species level (i.e. including all strains of that species) and to demonstrate similar or better accuracy than the other programs. Lastly, its speed was demonstrated to be many times that of BLAST due to the use of modern short read aligners. Our method is highly accurate if the bacteria in the sample are represented by genomes in the reference sequence but cannot find species absent from the reference. This method is one of the most user-friendly and resource-efficient approaches and is thus feasible for rapidly analysing millions of short reads on a personal computer.
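To illustrate what species-level read attribution (pooling all strains of a species) means in practice, here is a minimal sketch. It is not Genometa's actual pipeline; the function name, the (read_id, reference_strain) input format and the example identifiers are assumptions made purely for illustration.

```python
from collections import Counter
from typing import Dict, List, Tuple

def species_read_counts(alignments: List[Tuple[str, str]],
                        strain_to_species: Dict[str, str]) -> Counter:
    """Tally aligned reads at species level, pooling all strains of a species.

    `alignments` holds (read_id, reference_strain) pairs, e.g. parsed from a
    short-read aligner's output; `strain_to_species` maps each reference
    genome to its species name."""
    counts: Counter = Counter()
    seen = set()
    for read_id, strain in alignments:
        if read_id in seen:          # count each read only once
            continue
        seen.add(read_id)
        counts[strain_to_species[strain]] += 1
    return counts

# Illustrative input: reads hitting two strains of the same species pool together.
aln = [("r1", "E_coli_K12"), ("r2", "E_coli_O157"), ("r3", "B_fragilis_NCTC")]
taxa = {"E_coli_K12": "Escherichia coli", "E_coli_O157": "Escherichia coli",
        "B_fragilis_NCTC": "Bacteroides fragilis"}
print(species_read_counts(aln, taxa))
```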
Abstract:
Methods to measure enteric methane (CH4) emissions from individual ruminants in their production environment are required to validate emission inventories and verify mitigation claims. Estimates of daily methane production (DMP) based on consolidated short-term emission measurements are being developed, but method verification is required. Two cattle experiments were undertaken to test the hypothesis that DMP estimated by averaging multiple short-term breath measures of methane emission rate does not differ from DMP measured in respiration chambers (RC). Short-term emission rates were obtained from a GreenFeed Emissions Monitoring (GEM) unit, which measured emission rate while cattle consumed a dispensed supplement. In experiment 1 (Expt. 1), four non-lactating cattle (LW=518 kg) were adapted for 18 days and then measured for six consecutive periods. Each period consisted of 2 days of ad libitum intake and GEM emission measurement followed by 1 day in the RC. A prototype GEM unit releasing water as an attractant (GEM water) was also evaluated in Expt. 1. Experiment 2 (Expt. 2) was a larger study based on a similar design, with 10 cattle (LW=365 kg) adapted for 21 days, and GEM measurement was extended to 3 days in each of the six periods. In Expt. 1, there was no difference in DMP estimated by the GEM unit relative to the RC (209.7 v. 215.1 g CH4/day) and no difference between these methods in methane yield (MY; 22.7 v. 23.7 g CH4/kg of dry matter intake, DMI). In Expt. 2, the correlations between GEM and RC measures of DMP and MY were assessed using 95% confidence intervals, with no difference in DMP or MY between methods and high correlations between GEM and RC measures for DMP (r=0.85; 215 v. 198 g CH4/day, SEM=3.0) and for MY (r=0.60; 23.8 v. 22.1 g CH4/kg DMI, SEM=0.42). When data from both experiments were combined, neither DMP nor MY differed between GEM- and RC-based measures (P>0.05). GEM water-based estimates of DMP and MY were lower than those from RC and GEM (P<0.05). Cattle accessed the GEM water unit with similar frequency to the GEM unit (2.8 v. 3.5 times/day, respectively), but eructation frequency was reduced from 1.31 times/min (GEM) to once every 2.6 min (GEM water). These studies confirm the hypothesis that DMP estimated by averaging multiple short-term breath measures of methane emission rate using GEM does not differ from measures of DMP obtained from RCs. Further, combining many short-term measures of methane production rate during supplement consumption provides an estimate of DMP that can be usefully applied in estimating MY.
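The arithmetic behind the two headline quantities is simple: DMP is taken as the average of many short-term emission-rate measurements, and methane yield is MY = DMP / DMI. The sketch below shows that calculation with made-up numbers; it assumes each GEM visit has already been converted to a daily-equivalent rate (g CH4/day), which glosses over the scaling and filtering the actual method involves.

```python
from statistics import mean

def daily_methane_production(rates_g_per_day: list) -> float:
    """Average many short-term emission-rate measurements (each already
    expressed as g CH4/day) into a single DMP estimate."""
    return mean(rates_g_per_day)

def methane_yield(dmp_g_per_day: float, dmi_kg_per_day: float) -> float:
    """Methane yield, MY = DMP / DMI (g CH4 per kg dry matter intake)."""
    return dmp_g_per_day / dmi_kg_per_day

# Illustrative values only: five GEM visits for one animal and a 9 kg/day DMI.
visits = [190.0, 220.0, 205.0, 215.0, 200.0]
dmp = daily_methane_production(visits)
print(round(dmp, 1), "g CH4/day,", round(methane_yield(dmp, 9.0), 1), "g CH4/kg DMI")
```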
Abstract:
The ocean bottom pressure records from eight stations of the Cascadia array are used to investigate the properties of short surface gravity waves with frequencies ranging from 0.2 to 5 Hz. It is found that the pressure spectrum at all sites is a well-defined function of the wind speed U10 and frequency f, with only a minor shift of a few dB from one site to another that can be attributed to variations in bottom properties. This observation can be combined with the theoretical prediction that the ocean bottom pressure spectrum is proportional to the surface gravity wave spectrum E(f) squared, times the overlap integral I(f), which is given by the directional wave spectrum at each frequency. This combination, using E(f) estimated from modeled spectra or parametric spectra, yields an overlap integral I(f) that is a function of the local wave age, characterized here by the ratio f/fPM, where fPM is the Pierson-Moskowitz peak frequency. This function is maximum for f/fPM = 8 and decreases by 10 dB for f/fPM = 2 and f/fPM = 30. This shape of I(f) can be interpreted as a maximum width of the directional wave spectrum at f/fPM = 8, possibly equivalent to an isotropic directional spectrum, and a narrower directional distribution toward both the dominant low frequencies and the higher capillary-gravity wave frequencies.
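Written out, the proportionality stated above, and the way it is inverted to estimate the overlap integral, are as follows (schematically; the proportionality constant and normalization are not given in the abstract and are omitted):

```latex
% Bottom pressure spectrum vs. surface wave spectrum and overlap integral,
% as stated in the abstract (constants omitted):
\[
  F_{p}(f) \;\propto\; E(f)^{2}\, I(f)
  \qquad\Longrightarrow\qquad
  I(f) \;\propto\; \frac{F_{p}(f)}{E(f)^{2}},
\]
% with E(f) taken from modeled or parametric spectra, so that the estimated
% I(f) can be examined as a function of the non-dimensional frequency f/f_{PM}.
```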
Abstract:
It is an Olympic year and we have just witnessed the fantastic games hosted by Rio de Janeiro. Well done to team USA for winning the most medals overall, but also well done to so many other nations and individuals who performed so well or were ambassadors in other ways. Among them were teenage swimmer Yusra Mardini, who swam for the refugee team, and South Africa's Wayde van Niekerk, who broke Michael Johnson's longstanding 400 m record, which had stood since 1999. Of course, we must mention sprinter Usain Bolt and swimmer Michael Phelps, who have now transcended superstar status and entered a new level of icon. My personal highlight was the sportsmanship witnessed in the 5000 m when American Abbey D’Agostino was accidentally felled by New Zealand runner Nikki Hamblin. D’Agostino helped Hamblin back to her feet but slumped to the track after realising her own injury. Hamblin then helped her up and stayed with her so that both completed the race. The International Olympic Committee has awarded both the prestigious Pierre de Coubertin award, also known as the International Fair Play Trophy. Fair play is also of paramount importance in peer-reviewed publishing. At CLAE, as at other journals, we try to maintain this by ensuring double-blind peer review and by allowing authors to select the most appropriate handling editor for their submission. Our handling editors are placed across the world (2 in Europe, 1 in the Americas, 1 in Australia and 1 in Asia) and part of their role is to encourage submissions from their region. Over the last decade we have certainly seen more and more papers from places that haven’t previously published in CLAE. In this issue of CLAE we have a true international blend of papers, with authors from the UK, USA, Iran, Jordan, France, Poland, Turkey, Nigeria, Spain and Brazil. I think it's a testament to the continued success of the journal that we are attracting new writers from so many parts of the world while retaining papers from more established authors and research centres. We do continue to attract many weaker papers that are rejected early in the review process; often these are unexceptional case reports or papers describing a surgical technique. Case reports are published, but only those that offer something original and especially those with interesting photographs. In this issue you will see that Professor James Wolffsohn (UK) has an interesting paper drawing on much of his recent research activity on the clinical evaluation of methods of correcting presbyopia; in it he highlights predictors to aid the success of presbyopic contact lenses. If you have been involved in any clinical work or research in the field of dry eye disease then you will know well the CLDEQ (Contact Lens Dry Eye Questionnaire) devised by Robin Chalmers and her colleagues (USA); this issue of CLAE details the latest research using the CLDEQ-8 (the 8-item version of the CLDEQ). The Shahroud Eye Cohort Study has produced many papers already, and in this issue we see Fotouhi Akbar (Iran) looking at changes in central and peripheral corneal thickness over a five year period. These days we use a lot of new instrumentation, such as optical low-coherence reflectometry, and in this issue Emre Güler (Turkey) compares that to a new optical biometry unit. Dry eye is increasingly common, and in this issue we see a study by Oluyemi Fasina (Nigeria) investigating the disease in adults in South-West Nigeria.
The TearLab™ is now commonly used to investigate osmolarity, and Dorota Szczesna-Iskander (Poland) looks at the measurement variability of this device. Following the theme of dry eyes and tear testing, Renaud Laballe (France) looks at the use of scleral lenses as a reservoir-based ocular therapeutic system. In this issue we also have several papers looking at different aspects of keratoconus: Magdalena Popiela (UK) looks at the demographics of older keratoconic patients in Wales, Faik Orucoglu (Turkey) reports a novel scoring system for distinguishing keratoconus from normal eyes, Gonzalo Carracedo (Spain) reports the effect of rigid gas permeable lens wear on dry eye in keratoconus, and Hatice Nur Colak (Turkey) compares topography and aberrations in keratoconus. Other interesting papers you will find include Mera Haddad (Jordan) investigating contact lens prescribing in Jordan, Camilla Fraga Amaral (Brazil) offering a report on the use of ocular prosthetics, Naveed Ahmed Khan (Malaysia) reporting on the use of dimethyl sulfoxide in contact lens disinfectants, and Michael Killpartrick (UK) offering a short piece with some useful advice on contamination risk factors that may arise from the posterior surface of disposable lenses. So for this issue I would say that the Gold Medal for the biggest contribution in terms of papers has to go to Turkey. I could have awarded it to the UK too, but Turkey has three full papers and the UK has two plus one short communication. Turkey is also one of the countries that has shown the largest increase in submissions over the last decade. Finally, welcome aboard to our newest Editorial Board Member, Nicole Carnt from Australia. Nicole has been an active researcher for many years and has acted as a reviewer for CLAE many times in the past. We look forward to working with you.
Abstract:
PURPOSE: To assess the relationship between short-term and long-term changes in power at different corneal locations, relative to the change in central corneal power, and the 2-year change in axial elongation relative to baseline in children fitted with orthokeratology contact lenses (OK). METHODS: Thirty-one white European subjects 6 to 12 years of age, with myopia −0.75 to −4.00 DS and astigmatism ≤1.00 DC, were fitted with OK. Differences in refractive power 3 and 24 months post-OK, in comparison with baseline and relative to the change in central corneal power, were determined from corneal topography data in eight corneal regions (i.e., N[nasal]1, N2, T[temporal]1, T2, I[inferior]1, I2, S[superior]1, S2) and correlated with OK-induced axial length changes at 2 years relative to baseline. RESULTS: After 2 years of OK lens wear, axial length increased by 0.48±0.18 mm, but the regional changes in corneal power were not significantly correlated with the change in axial length (P>0.05). CONCLUSION: The reduction in central corneal power and the relative increase in paracentral and pericentral power induced by OK over 2 years were not significantly correlated with concurrent changes in axial length in white European children.
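As a minimal illustration of the kind of analysis described (correlating a regional corneal power change with 2-year axial elongation across subjects), the sketch below computes a Pearson correlation coefficient. The arrays are fabricated placeholders, not the study's data, and the result has no bearing on the reported findings.

```python
import numpy as np

# Hypothetical per-subject values (not the study's data):
# change in power in one corneal region (D) and 2-year axial elongation (mm).
power_change_D = np.array([1.2, 0.8, 1.5, 0.9, 1.1, 1.3])
axial_elongation_mm = np.array([0.45, 0.50, 0.40, 0.55, 0.48, 0.42])

# Pearson correlation between the regional power change and axial elongation.
r = np.corrcoef(power_change_D, axial_elongation_mm)[0, 1]
print(f"r = {r:.2f}")
```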