11 results for Online Analytical Processing

in Digital Commons at Florida International University


Relevance: 90.00%

Abstract:

The presence of inhibitory substances in biological forensic samples has affected, and continues to affect, the quality of the data generated by DNA typing processes. Although the chemistries used during these procedures have been enhanced to mitigate the effects of such deleterious compounds, some challenges remain. Inhibitors can be components of the sample itself, of the substrate where the sample was deposited, or of the chemicals associated with the DNA purification step. A thorough understanding of the extraction processes and their ability to handle the various types of inhibitory substances can therefore help define the best analytical processing for any given sample. A series of experiments was conducted to establish the inhibition tolerance of quantification and amplification kits using common inhibitory substances, in order to determine whether current laboratory practices are optimal for identifying potential problems associated with inhibition. DART mass spectrometry was used to determine the amount of inhibitor carryover after sample purification, its correlation with the initial inhibitor input, and its overall effect on the results. Finally, a novel alternative for gathering investigative leads from samples that would otherwise be unsuitable for DNA typing, due to large amounts of inhibitory substances and/or environmental degradation, was tested; this included generating data associated with microbial peak signatures to identify the locations of clandestine human graves. The results demonstrate that current methods for assessing inhibition are not necessarily accurate: samples that appear inhibited during quantification can yield full DNA profiles, while those showing no sign of inhibition may suffer from lowered amplification efficiency or PCR artifacts. The extraction methods tested removed >90% of the inhibitors from all samples, with the exception of phenol, which was present in variable amounts whenever the organic extraction approach was used. Although the results suggest that most inhibitors have minimal effect on downstream applications, analysts should exercise caution when selecting the extraction method for a particular sample, as casework DNA samples are often present in small quantities and can contain overwhelming amounts of inhibitory substances.
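A minimal sketch (Python, with invented numbers) of the carryover bookkeeping implied by this abstract: the percentage of an inhibitor surviving extraction and its correlation with the initial inhibitor input.

```python
# Illustrative carryover calculation; all values are invented, not data
# from the study. DART-MS would supply the post-extraction measurements.
import numpy as np

inhibitor_in = np.array([10.0, 50.0, 100.0, 500.0])  # ng added before extraction
inhibitor_out = np.array([0.4, 2.1, 6.0, 38.0])      # ng measured after extraction

removal_pct = 100.0 * (1.0 - inhibitor_out / inhibitor_in)
r = np.corrcoef(inhibitor_in, inhibitor_out)[0, 1]

print("removal %:", removal_pct.round(1))            # each > 90%
print("input vs. carryover correlation r =", round(r, 3))
```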

Relevance: 80.00%

Abstract:

Tall buildings are wind-sensitive structures and can experience large wind-induced effects. Aerodynamic boundary layer wind tunnel testing has been the most commonly used method for estimating wind effects on tall buildings: design wind effects are estimated through analytical processing of the data obtained from aerodynamic wind tunnel tests. Although it is widely agreed that wind tunnel data are fairly reliable, the post-test analytical procedures are still argued to carry considerable uncertainties. This research assessed in detail the uncertainties arising at different stages of the post-test analytical procedures and suggested improved techniques for reducing them. The results show that traditionally used simplifying approximations, particularly in the frequency-domain approach, can cause significant uncertainties in estimating aerodynamic wind-induced responses. Based on the identified shortcomings, a more accurate dual aerodynamic data analysis framework that works in both the frequency and time domains was developed. This comprehensive framework allows modal, resultant, and peak values of various wind-induced responses of a tall building to be estimated more accurately. Estimating design wind effects on tall buildings also requires synthesizing the wind tunnel data with local climatological data for the study site. After investigating the causes of significant uncertainties in currently used synthesis techniques, a novel copula-based approach was developed for accurately synthesizing aerodynamic and climatological data; its improvement over existing techniques is illustrated with a case study of a 50-story building. Finally, a practical dynamic optimization approach was suggested for tuning the structural properties of tall buildings toward optimum performance against wind loads with fewer design iterations.
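As a rough illustration of the copula idea, the sketch below samples a joint distribution of wind speed and a normalized load coefficient through a Gaussian copula. The marginals, the correlation, and the squared-speed load proxy are all assumptions made for illustration; the dissertation's actual copula family, fitted parameters, and response model are not given in this abstract.

```python
# Gaussian-copula synthesis of climatological and aerodynamic variables
# (hypothetical marginals and dependence).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

speed_marginal = stats.weibull_min(c=2.0, scale=8.0)  # wind speed, m/s (assumed)
coeff_marginal = stats.gumbel_r(loc=0.5, scale=0.1)   # normalized load coeff. (assumed)

rho = 0.6                                             # assumed dependence
cov = [[1.0, rho], [rho, 1.0]]

# Correlated standard normals -> uniforms -> target marginals.
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = stats.norm.cdf(z)
speed = speed_marginal.ppf(u[:, 0])
coeff = coeff_marginal.ppf(u[:, 1])

# Simple design-effect proxy: load grows with the square of wind speed.
load = coeff * speed**2
print("99.9th percentile load proxy:", round(np.quantile(load, 0.999), 1))
```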

Relevance: 40.00%

Abstract:

Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated on parallel lines. One example of this configuration is a manufacturing facility equipped to assemble and test web servers; a typical web server assembly line is characterized by multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations for predicting the performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models of assembly manufacturing systems in that they can handle serial and various parallel-processing configurations with multiple product classes, as well as job circulation due to random part failures. In addition, correction terms obtained via regression analysis were added to the approximations to minimize the error between the analytical approximations and simulation models. Markovian and general-type manufacturing systems were studied, with multiple product classes, job circulation due to failures, and fork-join systems to model parallel processing. In both the Markovian and the general case, the approximations without correction terms performed quite well for one- and two-product problem instances; however, the flow-time error increased with the number of products and the net traffic intensity. Correction terms for single and fork-join stations were therefore developed via regression analysis to handle more than two products. Numerical comparisons showed that the approximations perform remarkably well when the correction factors are used: on average, the flow-time error was reduced from 38.19% to 5.59% in the Markovian case and from 26.39% to 7.23% in the general case. All the equations in the analytical formulations were implemented as a set of MATLAB scripts, with which operations managers of web server assembly lines, or of manufacturing or service systems with similar characteristics, can estimate system performance measures and make judicious decisions, especially in setting delivery due dates, capacity planning, and bottleneck mitigation.
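To make the structure of such approximations concrete, here is a minimal Python sketch: an M/M/1 flow-time estimate for a single station, adjusted by an illustrative regression correction in the number of product classes and the traffic intensity. The coefficients are invented stand-ins for those fitted against simulation in the study, whose actual multi-class, fork-join formulations are far richer than this.

```python
# Single-station flow-time approximation with a hypothetical correction term.
def mm1_flow_time(arrival_rate: float, service_rate: float) -> float:
    """Expected flow time W = 1 / (mu - lambda) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

def corrected_flow_time(arrival_rate, service_rate, n_products,
                        b0=0.0, b1=0.05, b2=0.10):
    """Scale the base estimate by a regression-style correction; b0..b2 are
    made-up coefficients standing in for those fitted in the study."""
    rho = arrival_rate / service_rate                 # traffic intensity
    base = mm1_flow_time(arrival_rate, service_rate)
    return base * (1.0 + b0 + b1 * n_products + b2 * rho)

print(corrected_flow_time(arrival_rate=0.8, service_rate=1.0, n_products=3))
```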

Relevance: 30.00%

Abstract:

3D geographic information systems (GIS) are data- and computation-intensive by nature, while Internet users are usually equipped with low-end personal computers and network connections of limited bandwidth. Data reduction and performance optimization techniques are therefore of critical importance for quality-of-service (QoS) management in online 3D GIS. In this research, QoS management issues in distributed 3D GIS presentation were studied to develop 3D TerraFly, an interactive 3D GIS that supports high-quality online terrain visualization and navigation.

To tackle the QoS management challenges, a multi-resolution rendering model, adaptive level-of-detail (LOD) control, and mesh simplification algorithms were proposed to effectively reduce terrain model complexity. The rendering model is adaptively decomposed into sub-regions of up to three detail levels according to viewing distance and other dynamic quality measurements. The mesh simplification algorithm is a hybrid that combines edge straightening with quad-tree compression to reduce mesh complexity by removing geometrically redundant vertices; its main advantage is that the grid mesh can be processed directly in parallel, without triangulation overhead. Algorithms facilitating remote access and distributed processing of volumetric GIS data, such as data replication, directory service, request scheduling, and predictive data retrieval and caching, were also proposed.

A prototype of 3D TerraFly implemented in this research demonstrates the effectiveness of the proposed QoS management framework in handling interactive online 3D GIS. The system implementation details and future directions of this research are also addressed in this thesis.
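As a rough sketch of distance-based LOD control of the kind described above, the snippet below (Python) assigns each sub-region one of three detail levels from its viewing distance, with a quality scale factor standing in for the dynamic quality measurements. The thresholds and the scale are illustrative assumptions, not values from 3D TerraFly.

```python
# Pick one of three detail levels per sub-region from viewing distance.
from dataclasses import dataclass

@dataclass
class SubRegion:
    center_distance: float        # camera-to-region distance, meters

LOD_THRESHOLDS = (500.0, 2000.0)  # assumed cutoffs between levels

def choose_lod(region: SubRegion, quality_scale: float = 1.0) -> int:
    """Return detail level 0 (finest) to 2 (coarsest). quality_scale > 1
    pushes regions toward coarser levels, e.g. when frame rate drops."""
    d = region.center_distance * quality_scale
    if d < LOD_THRESHOLDS[0]:
        return 0
    if d < LOD_THRESHOLDS[1]:
        return 1
    return 2

print([choose_lod(SubRegion(d)) for d in (100.0, 900.0, 5000.0)])  # [0, 1, 2]
```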

Relevance: 30.00%

Abstract:

This paper reflects a research project on the influence of online news media (print, radio, and television outlets) on disaster response. Coverage of the October 2010 Indonesian tsunami and earthquake was gathered from 17 sources between October 26 and November 30. The data were analyzed quantitatively with respect to coverage intensity over time and across outlets. Qualitative analyses were also conducted using keywords and a value scale that assessed the degree of positivity or negativity associated with each keyword in the context of accountability. The results yielded insights into the influence of online media on actors' assumption of accountability and on the quality of response, and identified the optimal time window in which advocates and disaster management specialists can best present recommendations to improve policy and raise awareness. Coverage was analyzed per outlet, in groups, and as a whole in order to discern behavior patterns and better understand media interdependency. The project produced analytical insights but is primarily intended as a prototype for more refined and extensive research.
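A minimal sketch of how such keyword scoring might look is given below (Python); the keywords and their signed weights are invented for illustration, since the study's actual coding scheme is not given in this abstract.

```python
# Toy accountability value scale: sum signed keyword weights per article.
ACCOUNTABILITY_SCALE = {
    "praised": 2, "responded": 1, "delayed": -1, "blamed": -2,  # invented
}

def article_score(text: str) -> int:
    """Positive total = favorable accountability framing; negative = critical."""
    return sum(ACCOUNTABILITY_SCALE.get(w, 0) for w in text.lower().split())

print(article_score("Officials responded quickly but were blamed for the delay"))
```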

Relevance: 30.00%

Abstract:

E-commerce is an approach to achieving business goals through information technology, and it is quickly changing the way hospitality business is planned, monitored, and conducted. No longer do buyers and sellers need to engage in interpersonal communication for transactions to occur. The future of transaction processing, including cyber cash and digital checking, is directly attributable to e-commerce, which provides an efficient, reliable, secure, and effective platform for conducting hospitality business on the web.

Relevance: 30.00%

Abstract:

Background: Sucralose has gained popularity worldwide as a low-calorie artificial sweetener. Owing to its high stability and persistence, sucralose occurs widely in environmental waters, at concentrations that can reach several μg/L. Previous studies have used time-consuming sample preparation methods (offline solid phase extraction/derivatization) or methods with rather high detection limits (direct injection) for sucralose analysis. This study describes a faster and more sensitive analytical method for the determination of sucralose in environmental samples.

Results: An online SPE-LC-MS/MS method was developed that can quantify sucralose in 12 minutes using only 10 mL of sample, with method detection limits (MDLs) of 4.5 ng/L, 8.5 ng/L, and 45 ng/L for deionized water, drinking water, and reclaimed water (diluted 1:10 with deionized water), respectively. Sucralose was detected in 82% of the reclaimed water samples at concentrations reaching up to 18 μg/L; the monthly average over one year was 9.1 ± 2.9 μg/L. Based on the concentrations detected in U.S. wastewaters, the calculated per-capita mass load of sucralose discharged through WWTP effluents is 5.0 mg/day/person. As expected, the concentrations observed in drinking water were much lower but still relevant, reaching up to 465 ng/L. To evaluate the stability of sucralose, photodegradation experiments were performed in natural waters. Significant photodegradation was observed only in freshwater at 254 nm; minimal degradation (<20%) was observed for all matrices under more natural conditions (350 nm or a solar simulator). The only photolysis product of sucralose identified by high-resolution mass spectrometry was a de-chlorinated molecule at m/z 362.0535, with molecular formula C12H20Cl2O8.

Conclusions: The online SPE-LC-APCI-MS/MS method developed in this study was applied to more than 100 environmental samples. Sucralose was frequently detected (>80%), indicating that the conventional treatment process employed in sewage treatment plants does not remove it efficiently. Its detection in drinking waters suggests potential contamination of surface water and groundwater sources by anthropogenic wastewater streams. Its high resistance to photodegradation, minimal sorption, and high solubility indicate that sucralose could be a good tracer of anthropogenic wastewater intrusion into the environment.
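As a worked check of the per-capita figure, the mass load is the effluent concentration times the per-capita wastewater flow. The numbers below are assumptions chosen to reproduce the reported 5.0 mg/day/person, not values from the paper.

```python
# Per-capita mass load = concentration (ug/L) * flow (L/person/day) / 1000.
concentration_ug_per_L = 12.5    # assumed sucralose level in WWTP effluent
flow_L_per_person_day = 400.0    # assumed wastewater generation per person

load_mg_per_person_day = concentration_ug_per_L * flow_L_per_person_day / 1000.0
print(f"{load_mg_per_person_day:.1f} mg/day/person")  # -> 5.0
```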

Relevance: 30.00%

Abstract:

Routine monitoring of environmental pollution demands simplicity and speed without sacrificing sensitivity or accuracy. The development and application of sensitive, fast, easy-to-implement analytical methodologies for detecting emerging and traditional water and airborne contaminants in South Florida is presented. A novel method was developed for quantification of the herbicide glyphosate, based on lyophilization followed by derivatization and simultaneous detection by fluorescence and mass spectrometry. Samples were analyzed from water canals that will hydrate the estuarine wetlands of Biscayne National Park, detecting inputs of glyphosate from both aquatic usage and agricultural runoff from farms. A second study describes a set of fast, automated LC-MS/MS protocols for the analysis of dioctyl sulfosuccinate (DOSS) and 2-butoxyethanol, two components of Corexit®; around 1.8 million gallons of these dispersant formulations were used in the response to the 2010 Gulf of Mexico oil spill. The methods presented here allow trace-level detection of these compounds in seawater, crude oil, and commercial dispersant formulations. In addition, two methodologies were developed for the analysis of well-known pollutants, namely polycyclic aromatic hydrocarbons (PAHs) and airborne particulate matter (APM). PAHs are ubiquitous environmental contaminants, and some are potent carcinogens. Traditional GC-MS analysis is labor-intensive and consumes large amounts of toxic solvents; this study provides an alternative automated SPE-LC-APPI-MS/MS analysis with minimal sample preparation and lower solvent consumption. The system can inject, extract, clean, separate, and detect 28 PAHs and 15 families of alkylated PAHs in 28 minutes, and the methodology was tested with environmental samples from Miami. Airborne particulate matter is a mixture of particles of chemical and biological origin, and assessment of its elemental composition is critical for the protection of sensitive ecosystems and public health. The APM collected from Port Everglades between 2005 and 2010 was analyzed by ICP-MS after acid digestion of the filters. The most abundant elements were Fe and Al, followed by Cu, V, and Zn. Enrichment factors show that hazardous elements (Cd, Pb, As, Co, Ni, and Cr) are introduced by anthropogenic activities. The data suggest that the major sources of APM were an electricity plant, road dust, industrial emissions, and marine vessels.
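The enrichment factor used above is conventionally computed against a crustal reference element such as Al, EF = (X/Al)_sample / (X/Al)_crust, with values well above 1 (commonly >10) pointing to a non-crustal, anthropogenic source. A minimal Python sketch follows; the APM values are hypothetical and the crustal abundances are approximate literature figures.

```python
# Crustal enrichment factor with Al as the reference element.
CRUST_PPM = {"Al": 82300.0, "Pb": 14.0, "V": 120.0}  # approximate upper crust

def enrichment_factor(sample: dict, element: str, ref: str = "Al") -> float:
    """EF = (X/ref)_sample / (X/ref)_crust; EF >> 1 suggests human sources."""
    return (sample[element] / sample[ref]) / (CRUST_PPM[element] / CRUST_PPM[ref])

apm = {"Al": 1200.0, "Pb": 8.0, "V": 15.0}  # hypothetical ng/m^3 in APM
for el in ("Pb", "V"):
    print(el, round(enrichment_factor(apm, el), 1))  # Pb strongly enriched
```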