Abstract:
Software architecture is the abstract design of a software system. It plays a key role as a bridge between requirements and implementation and serves as a blueprint for development. The architecture represents a set of early design decisions that are crucial to a system, and mistakes in those decisions are very costly if they remain undetected until the system is implemented and deployed. This is where formal specification and analysis fit in. Formal specification ensures that an architecture design is represented in a rigorous and unambiguous way, and a formally specified model allows different analysis techniques to be used to verify the correctness of those crucial design decisions.

This dissertation presented a framework, called SAM, for the formal specification and analysis of software architectures. In terms of specification, formalisms and mechanisms were identified and chosen to specify software architecture according to different analysis needs. Formalisms for specifying properties were also explored, especially for non-functional properties. In terms of analysis, the dissertation explored both the verification of functional properties and the evaluation of non-functional properties of software architecture. For the verification of functional properties, methodologies were presented on how to apply existing model checking techniques to a SAM model. For the evaluation of non-functional properties, the dissertation first showed how to incorporate stochastic information into a SAM model and then explained how to translate the model to existing tools and conduct the analysis using those tools.

To alleviate the analysis work, a tool was also provided to automatically translate a SAM model for model checking. All the techniques and methods described in the dissertation were illustrated with examples or case studies, which also served the purpose of advocating the use of formal methods in practice.
Abstract:
This dissertation presents a unique research opportunity by using recordings that provide an electrocardiogram (ECG) plus a reference breathing signal (RBS). ECG-derived breathing (EDR) is measured and correlated against the RBS. Standard deviations of multiresolution wavelet analysis coefficients (SDMW) are obtained from heart rate and classified using the RBS. Prior work by others used selected patients for sleep apnea scoring with EDR but no RBS; another prior study classified selected heart disease patients with SDMW but no RBS. This study used randomly chosen sleep disorder patient recordings covering central and obstructive apneas, with and without heart disease.

Implementation required creating an application because existing systems were limited in power and scope. A review survey was created to choose a development environment and is presented as a learning tool and teaching resource. The development objectives were rapid development using limited resources (manpower and money), and Open Source resources were used exclusively for the implementation.

Results show: (1) Three groups of patients exist in the study. Grouping the RBS correlations shows a response with either ECG interval or amplitude variation; in a third group, neither ECG intervals nor amplitude variation correlate with breathing. (2) Previous work by other groups analyzed SDMW. Similar results were found in this study, but some subjects had higher SDMW, attributed to a large number of apneas, arousals, and/or disconnects. SDMW does not need the RBS to show that apneic conditions exist within ECG recordings. (3) The results support the assertion that autonomic nervous system variation was measured with SDMW; measurements using the RBS are not corrupted by breathing even though respiration overlaps the same frequency band.

Overall, this work becomes an Open Source resource that can be reused, modified, and/or expanded, and it might fast-track additional research. In the future the system could also be used with public domain data; prerecorded data exist in similar formats in public databases, which could provide additional research opportunities.
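As a rough illustration of the SDMW idea described above, the sketch below computes the standard deviation of multiresolution wavelet detail coefficients from a heart-rate (RR-interval) series. The wavelet family (db4), the decomposition depth, and the synthetic RR series are illustrative assumptions, not the dissertation's actual parameters or data.

# Minimal sketch: standard deviations of multiresolution wavelet
# coefficients (SDMW-style features) from a heart-rate (RR-interval) series.
# Assumptions: PyWavelets ('pywt'), a Daubechies wavelet, and a synthetic
# RR series; the dissertation's actual wavelet, depth, and preprocessing
# may differ.
import numpy as np
import pywt

def sdmw_features(rr_intervals, wavelet="db4", max_level=5):
    """Return the standard deviation of detail coefficients per level."""
    coeffs = pywt.wavedec(rr_intervals, wavelet, level=max_level)
    # coeffs[0] is the approximation; coeffs[1:] are details, coarsest to finest.
    return {f"level_{max_level - i}": float(np.std(d))
            for i, d in enumerate(coeffs[1:])}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic RR series (seconds): baseline plus slow respiratory modulation.
    t = np.arange(1024)
    rr = 0.8 + 0.05 * np.sin(2 * np.pi * t / 60) + 0.01 * rng.standard_normal(1024)
    print(sdmw_features(rr))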
Abstract:
Today, the development of domain-specific communication applications is both time-consuming and error-prone because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as TCP/UDP sockets, SIP (Session Initiation Protocol), and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from simple phone calls and video conferencing to specialized applications such as disaster management and telemedicine, and thereby simplifies the development of domain-specific communication applications. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology; some design errors were found during model creation because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. In the dissertation, several system properties are defined as temporal logic formulas, which are manually translated into Promela, individually integrated with the Promela model of UCM, and verified using the SPIN tool. The formal analysis helps verify system properties (for example, the multiparty multimedia protocol) and uncover bugs in the system.
Abstract:
The purpose of this research was to demonstrate the applicability of reduced-size STR (Miniplex) primer sets to challenging samples and to provide the forensic community with new information regarding the analysis of degraded and inhibited DNA. The Miniplex primer sets were validated in accordance with guidelines set forth by the Scientific Working Group on DNA Analysis Methods (SWGDAM) in order to demonstrate the scientific validity of the kits. The Miniplex sets were also used in the analysis of DNA extracted from human skeletal remains and telogen hair. In addition, a method for evaluating the mechanism of PCR inhibition was developed using qPCR. The Miniplexes were demonstrated to be a robust and sensitive tool for the analysis of DNA with as little as 100 pg of template DNA. They also proved to be better than commercial kits in the analysis of DNA from human skeletal remains, with 64% of samples tested producing full profiles, compared to 16% for a commercial kit. The Miniplexes also produced amplification of nuclear DNA from human telogen hairs, with partial profiles obtained from as little as 60 pg of template DNA. These data suggest that smaller PCR amplicons may provide a useful alternative to mitochondrial DNA for forensic analysis of degraded DNA from human skeletal remains, telogen hairs, and other challenging samples. In the evaluation of inhibition by qPCR, the effects of amplicon length and primer melting temperature were evaluated in order to determine the binding mechanisms of different PCR inhibitors. Several mechanisms were indicated by the inhibitors tested, including binding of the polymerase, binding to the DNA, and effects on the processivity of the polymerase during primer extension. The data obtained from qPCR illustrated a method by which the type of inhibitor could be inferred in forensic samples, and some methods of reducing inhibition for specific inhibitors were demonstrated. An understanding of the mechanism of the inhibitors found in forensic samples will allow analysts to select the proper methods for inhibition removal or the type of analysis that can be performed, and will increase the information that can be obtained from inhibited samples.
Abstract:
The integration of automation (specifically Global Positioning Systems (GPS)) and Information and Communications Technology (ICT) through the creation of a Total Jobsite Management Tool (TJMT) in construction contractor companies can revolutionize the way contractors do business. The key to this integration is the collection and processing of real-time GPS data produced on the jobsite for use in project management applications. This research study established the need for an effective planning and implementation framework to assist construction contractor companies in navigating the terrain of GPS and ICT use. An Implementation Framework was developed using the Action Research approach. The framework consists of three components: (i) an ICT Infrastructure Model, (ii) an Organizational Restructuring Model, and (iii) a Cost/Benefit Analysis. The conceptual ICT infrastructure model was developed to show decision makers within highway construction companies how to collect, process, and use GPS data for project management applications. The organizational restructuring model was developed to assist companies in the analysis and redesign of business processes, data flows, core job responsibilities, and their organizational structure in order to obtain the maximum benefit at the least cost when implementing GPS as a TJMT. A cost-benefit analysis that identifies and quantifies the costs and benefits (both direct and indirect) was performed in the study to clearly demonstrate the advantages of using GPS as a TJMT. Finally, the study revealed that in order to successfully implement a program to utilize GPS data as a TJMT, it is important for construction companies to understand the various implementation and transitioning issues that arise when adopting this new technology and business strategy. In the study, Factors for Success were identified and ranked to allow a construction company to understand the factors that may contribute to or detract from the prospects for success during implementation. The Implementation Framework developed as a result of this study will serve to guide highway construction companies in the successful integration of GPS and ICT technologies for use as a TJMT.
Abstract:
Today, over 15,000 Ion Mobility Spectrometry (IMS) analyzers are employed at worldwide security checkpoints to detect explosives and illicit drugs. Current portal IMS instruments and other electronic nose technologies detect explosives and drugs by analyzing samples containing the headspace air and loose particles residing on a surface. Canines can outperform these systems at sampling and detecting low-vapor-pressure explosives and drugs, such as RDX, PETN, cocaine, and MDMA, because these biological detectors target the volatile signature compounds available in the headspace rather than the non-volatile parent compounds of explosives and drugs.

In this dissertation research, volatile signature compounds available in the headspace over explosive and drug samples were detected using solid-phase microextraction (SPME) as a headspace sampling tool coupled to an IMS analyzer. A Genetic Algorithm (GA) technique was developed to optimize the operating conditions of a commercial IMS (GE Itemizer 2), leading to the successful detection of plastic explosives (Detasheet, Semtex H, and C-4) and illicit drugs (cocaine, MDMA, and marijuana). Short sampling times (between 10 sec and 5 min) were adequate to extract and preconcentrate sufficient analytes (> 20 ng) representing the volatile signatures in the headspace of a 15 mL glass vial or a quart-sized can containing ≤ 1 g of the bulk explosive or drug.

Furthermore, a research-grade IMS with flexibility for changing operating conditions and physical configurations was designed and fabricated to accommodate future research into different analytes or physical configurations. The design and construction of the FIU-IMS were facilitated by computer modeling and simulation of ion behavior within an IMS. The simulation method developed uses SIMION/SDS and was evaluated with experimental data collected using a commercial IMS (PCP Phemto Chem 110). The FIU-IMS instrument has performance comparable to the GE Itemizer 2 (average resolving power of 14, resolution of 3 between two drugs and two explosives, and LODs ranging from 0.7 to 9 ng).

The results from this dissertation further advance the concept of targeting volatile components to presumptively detect the presence of concealed bulk explosives and drugs by SPME-IMS, and the new FIU-IMS provides a flexible platform for future IMS research projects.
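To illustrate the kind of genetic-algorithm optimization mentioned above, the sketch below tunes two hypothetical IMS operating parameters against a surrogate response surface. The parameter names, bounds, and fitness function are invented placeholders and do not reflect the GE Itemizer 2 settings or results.

# Minimal sketch of a genetic algorithm tuning two IMS operating
# parameters (e.g., drift-tube temperature and drift-gas flow).
# The parameter bounds and the surrogate "signal response" fitness
# function are illustrative assumptions, not the instrument's real model.
import numpy as np

rng = np.random.default_rng(1)
BOUNDS = np.array([[100.0, 250.0],   # temperature (deg C), assumed range
                   [200.0, 800.0]])  # gas flow (mL/min), assumed range

def fitness(p):
    # Surrogate response surface with a single optimum (placeholder only).
    temp, flow = p
    return -((temp - 190.0) / 50.0) ** 2 - ((flow - 500.0) / 150.0) ** 2

def evolve(pop_size=30, generations=40, mutation_rate=0.2):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 2))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        # Tournament selection of parents.
        parents = pop[[max(rng.choice(pop_size, 2), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # Uniform crossover between consecutive parents.
        mask = rng.random((pop_size, 2)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation, clipped to the allowed operating ranges.
        mutate = rng.random((pop_size, 2)) < mutation_rate
        children += mutate * rng.normal(0, 0.05, (pop_size, 2)) * (BOUNDS[:, 1] - BOUNDS[:, 0])
        pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])
    return pop[np.argmax([fitness(p) for p in pop])]

if __name__ == "__main__":
    print("best operating point (temp, flow):", evolve())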
Abstract:
This paper details the research methods an introductory qualitative research class used both to study an issue related to race and identity and to familiarize themselves with data collection strategies. Throughout the paper, the authors attempt to capture the challenges, disagreements, and consensus building that marked this unusual research endeavor.
Abstract:
Cotton is the most abundant natural fiber in the world. Many countries are involved in the growing, importation, exportation, and production of this commodity. Paper documentation claiming geographic origin is the current method employed at U.S. ports for identifying cotton sources and enforcing tariffs. Because customs documentation can be easily falsified, it is necessary to develop a robust method for authenticating or refuting the claimed source of cotton commodities. This work presents, for the first time, a comprehensive approach to the chemical characterization of unprocessed cotton in order to provide an independent tool for establishing geographic origin. Elemental and stable isotope ratio analysis of unprocessed cotton increases the ability to distinguish cotton beyond the physical and morphological examinations that could be, and currently are, performed. Elemental analysis was conducted using LA-ICP-MS, LA-ICP-OES, and LIBS in order to offer a direct comparison of the analytical performance of each technique and to determine the utility of each technique for this purpose. Multivariate predictive modeling approaches were used to determine the potential of elemental and stable isotopic information to aid in the geographic provenancing of unprocessed cotton of both domestic and foreign origin. These approaches assess the stability of the profiles to temporal and spatial variation in order to determine the feasibility of this application. This dissertation also evaluates plasma conditions and ablation processes so as to improve the quality of analytical measurements made using atomic emission spectroscopy techniques. These interactions, particularly in LIBS, are assessed to determine the potential for simplifying the instrumental design and method development phases. This is accomplished through the analysis of several matrices representing different physical substrates to determine the potential of adopting universal LIBS parameters, at both 532 nm and 1064 nm, for some important operating parameters. A novel approach to evaluate both ablation processes and plasma conditions using a single measurement was developed and used to determine the "useful ablation efficiency" for different materials. The work presented here demonstrates the potential for an a priori prediction of some probable laser parameters important in analytical LIBS measurements.
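The sketch below shows, in broad strokes, how a multivariate predictive model might classify cotton origin from elemental profiles. The element list, simulated concentrations, and the choice of linear discriminant analysis with cross-validation are illustrative assumptions rather than the dissertation's actual data or models.

# Minimal sketch of a multivariate classification approach to geographic
# provenancing from elemental profiles. The elements, simulated
# concentrations, and model choice are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
elements = ["Al", "Ba", "Ca", "Fe", "Mg", "Sr"]   # assumed element menu
regions = {"US": 0, "India": 1, "Brazil": 2}      # assumed origin classes

# Simulate 40 cotton samples per region with region-specific mean profiles.
X, y = [], []
for label in regions.values():
    means = 10 + 5 * label + rng.uniform(0, 3, len(elements))
    X.append(rng.normal(means, 2.0, size=(40, len(elements))))
    y.extend([label] * 40)
X, y = np.vstack(X), np.array(y)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print("mean classification accuracy:", scores.mean().round(3))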
Abstract:
The primary goal of this dissertation is the study of patterns of viral evolution inferred from serially sampled sequence data, i.e., sequence data obtained from strains isolated at consecutive time points from a single patient or host. RNA viral populations have extremely high genetic variability, largely due to their astronomical population sizes within host systems, high replication rates, and short generation times. It is this aspect of their evolution that demands special attention and a different approach when studying the evolutionary relationships of serially sampled sequence data. New methods that analyze serially sampled data were developed shortly after a groundbreaking HIV-1 study of several patients from whom viruses were isolated at recurring intervals over a period of 10 or more years. These methods assume a tree-like evolutionary model, while many RNA viruses have the capacity to exchange genetic material with one another through a process called recombination.

A genealogy involving recombination is best described by a network structure. A more general approach was implemented in a new computational tool, Sliding MinPD, which is mindful of the sampling times of the input sequences and reconstructs the viral evolutionary relationships in the form of a network structure with implicit representations of recombination events. The underlying network organization reveals unique patterns of viral evolution and could help explain the emergence of disease-associated mutants and drug-resistant strains, with implications for patient prognosis and treatment strategies. In order to comprehensively test the developed methods and to carry out comparison studies with other methods, synthetic data sets are critical. Therefore, appropriate sequence generators were also developed to simulate the evolution of serially sampled recombinant viruses, new and more thorough evaluation criteria for recombination detection methods were established, and three major comparison studies were performed. The newly developed tools were also applied to "real" HIV-1 sequence data, and it was shown that the results, represented within an evolutionary network structure, can be interpreted in biologically meaningful ways.
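As a toy illustration of the minimum-pairwise-distance idea behind ancestor assignment for serially sampled sequences, the sketch below links each later-sampled sequence to its closest earlier-sampled sequence using plain Hamming distance. It omits Sliding MinPD's sliding-window recombination detection, and the sequences are invented examples.

# Toy illustration of minimum-pairwise-distance ancestor assignment for
# serially sampled sequences (the core idea behind MinPD-style methods).
# Plain Hamming distance only; no sliding windows or recombination handling.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Sequences grouped by sampling time (t0 earliest); invented examples.
samples = {
    0: {"t0_a": "ACGTACGTAC", "t0_b": "ACGTACGGAC"},
    1: {"t1_a": "ACGTACGGAT", "t1_b": "ACGAACGTAC"},
    2: {"t2_a": "ACGAACGTCC"},
}

edges = []  # (ancestor, descendant, distance)
times = sorted(samples)
for t in times[1:]:
    earlier = {name: seq for u in times if u < t for name, seq in samples[u].items()}
    for name, seq in samples[t].items():
        ancestor = min(earlier, key=lambda n: hamming(earlier[n], seq))
        edges.append((ancestor, name, hamming(earlier[ancestor], seq)))

for anc, desc, d in edges:
    print(f"{anc} -> {desc}  (distance {d})")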
Abstract:
Petri nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems, and they have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flow but are not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, for example by using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs yield compact system models that are easier to understand, and they are therefore more useful for modeling complex systems.

There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool.

For modeling, the framework integrates two formal languages: a type of HLPN called Predicate Transition nets (PrT nets) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main contribution of this dissertation with regard to modeling is a software tool that supports the formal modeling capabilities of this framework.

For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is a straightforward and speedy method but covers only some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities of this framework.

The SAMTools developed for this framework integrate three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
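The sketch below gives a flavor of bounded analysis on a toy low-level Petri net: markings are explored breadth-first up to a fixed depth to check whether a marking covering a target condition is reachable. Real PrT nets carry structured tokens and transition formulas, and SAT-based BMC encodes the unrolling differently; the net, property, and bound here are illustrative assumptions unrelated to PIPE+/SAMAT internals.

# Minimal sketch of depth-bounded reachability on a toy low-level Petri net.
# The net, the target condition, and the bound are illustrative assumptions.
from collections import deque

# Transitions: name -> (tokens consumed per place, tokens produced per place)
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def bounded_reachable(initial, target, bound=10):
    """Breadth-first search for a marking covering `target` within `bound` steps."""
    queue, seen = deque([(initial, 0)]), {tuple(sorted(initial.items()))}
    while queue:
        marking, depth = queue.popleft()
        if all(marking.get(p, 0) >= n for p, n in target.items()):
            return True, depth
        if depth == bound:
            continue
        for pre, post in transitions.values():
            if enabled(marking, pre):
                nxt = fire(marking, pre, post)
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append((nxt, depth + 1))
    return False, None

if __name__ == "__main__":
    print(bounded_reachable({"p1": 1}, target={"p3": 1}, bound=5))  # (True, 2)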
Abstract:
Speckle is being used as a characterization tool for analyzing the dynamics of slowly varying phenomena occurring in biological and industrial samples. The retrieved data take the form of a sequence of speckle images, and the analysis of these images should reveal the inner dynamics of the biological or physical process taking place in the sample. Very recently, it has been shown that principal component analysis is able to split the original data set into a collection of classes, and these classes can be related to the dynamics of the observed phenomena. At the same time, statistical descriptors of biospeckle images have been used to retrieve information on the characteristics of the sample; these descriptors can be calculated in almost real time and provide fast monitoring of the sample. Principal component analysis, on the other hand, requires longer computation time, but the results contain more information related to spatio-temporal patterns that can be identified with physical processes. This contribution merges both descriptions and uses principal component analysis as a pre-processing tool to obtain a collection of filtered images on which a simpler statistical descriptor can be calculated. The method has been applied to slowly varying biological and industrial processes.
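The sketch below illustrates the merged approach on synthetic data: PCA (via SVD) filters a speckle image sequence, and a simple per-pixel temporal standard deviation is then computed on the filtered sequence. The synthetic sequence, the number of retained components, and the descriptor choice are illustrative assumptions.

# Minimal sketch: PCA as pre-processing for a speckle image sequence,
# followed by a simple per-pixel activity descriptor (temporal std).
# Synthetic data and the retained-component cutoff are assumptions.
import numpy as np

rng = np.random.default_rng(3)
frames, h, w = 64, 32, 32
# Synthetic speckle-like sequence: noisy background plus a slowly varying patch.
seq = rng.normal(0, 1, (frames, h, w))
seq[:, 8:16, 8:16] += np.sin(np.linspace(0, 4 * np.pi, frames))[:, None, None]

# PCA over time: rows are frames, columns are pixels.
X = seq.reshape(frames, -1)
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3  # keep the leading components (assumed cutoff)
filtered = (U[:, :k] * S[:k]) @ Vt[:k] + X.mean(axis=0)
filtered = filtered.reshape(frames, h, w)

# Simple statistical descriptor on the filtered sequence: temporal std per pixel.
activity_map = filtered.std(axis=0)
print("mean activity inside patch:", activity_map[8:16, 8:16].mean().round(3))
print("mean activity outside patch:", activity_map[:8, :8].mean().round(3))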
Abstract:
Cancer comprises a collection of diseases, all of which begin with abnormal tissue growth from various stimuli, including (but not limited to) heredity, genetic mutation, exposure to harmful substances, radiation, poor diet, and lack of exercise. The early detection of cancer is vital to providing life-saving therapeutic intervention. However, current methods for detection (e.g., tissue biopsy, endoscopy, and medical imaging) often suffer from low patient compliance and an elevated risk of complications in elderly patients. As such, many are looking to "liquid biopsies" for clues about the presence and status of cancer because of their minimal invasiveness and ability to provide rich information about the native tumor. In such liquid biopsies, peripheral blood is drawn from patients and screened for key biomarkers, chiefly circulating tumor cells (CTCs). Capturing, enumerating, and analyzing the genetic and metabolomic characteristics of these CTCs may hold the key to guiding doctors toward a better understanding of the source of cancer at an earlier stage, for more efficacious disease management.
The isolation of CTCs from whole blood, however, remains a significant challenge due to their (i) low abundance, (ii) lack of a universal surface marker, and (iii) epithelial-mesenchymal transition, which down-regulates common surface markers (e.g., EpCAM) and reduces their likelihood of detection via positive selection assays. These factors heighten the need for an improved cell isolation strategy that can collect CTCs via both positive and negative selection modalities, so as to avoid reliance on a single marker, or set of markers, for more accurate enumeration and diagnosis.
The technologies proposed herein offer a unique set of strategies to focus, sort, and template cells in three independent microfluidic modules. The first module exploits ultrasonic standing waves and a class of elastomeric particles for the rapid and discriminate sequestration of cells. This type of cell handling holds promise not only for sorting but also for the isolation of soluble markers from biofluids. The second module contains components to focus (i.e., arrange) cells via forces from acoustic standing waves and to separate cells in a high-throughput fashion via free-flow magnetophoresis. The third module uses a printed array of micromagnets to capture magnetically labeled cells in well-defined compartments, enabling on-chip staining and single-cell analysis. These technologies can operate in standalone formats or can be adapted to work with established analytical technologies such as flow cytometry. A key advantage of these innovations is their ability to process erythrocyte-lysed blood in a rapid (and thus high-throughput) fashion. They can process fluids at a variety of concentrations and flow rates, target cells with various immunophenotypes, and sort cells via positive (and potentially negative) selection. These technologies are chip-based and fabricated using standard clean-room equipment, with a view toward a disposable clinical tool. With further optimization of design and performance, these technologies might aid in the early detection, and potentially treatment, of cancer and various other physical ailments.
Abstract:
Concept maps are a technique used to obtain a visual representation of a person's ideas about a concept or a set of related concepts. Specifically, in this paper, through a qualitative methodology, we analyze the concept maps proposed by 52 groups of teacher training students in order to determine the characteristics of the maps and the degree of adequacy of their contents with regard to the teaching of human nutrition in the 3rd cycle of primary education. The participants were enrolled in the Teacher Training Degree majoring in Primary Education, and the data collection was carried out through a training activity on the theme "What to teach about science in primary school?" The results show that the maps are a useful tool for working in teacher education, as they allow students to organize, synthesize, and communicate what they know. Moreover, through this work it has been possible to see that future teachers have acceptable skills for representing concepts and ideas in a concept map, although the level of adequacy of the concepts and ideas about human nutrition and its relations is usually medium or low. These results are a wake-up call for teacher training, both initial and ongoing, because they show an inability to change priorities as far as the selection of content is concerned.
Abstract:
Qualitative Comparative Analysis (QCA) is a method for the systematic analysis of cases. A holistic view of cases and an approach to causality that emphasizes complexity are among its core features. Over the last decades, QCA has found application in many fields of the social sciences. In spite of this, its uptake in feminist research has been slower, and only recently has QCA been applied to topics related to social care, the political representation of women, and reproductive politics. Despite the comparative turn in feminist studies, researchers still privilege qualitative methods, in particular case studies, and are often skeptical of quantitative techniques (Spierings 2012). These studies show that the meaning and measurement of many gender concepts differ across countries and that the factors leading to feminist success and failure are context specific. However, case study analyses struggle to systematically account for the ways in which these forces operate in different locations.
Abstract:
One of the most challenging tasks underlying many hyperspectral imagery applications is spectral unmixing, which decomposes a mixed pixel into a collection of reflectance spectra, called endmember signatures, and their corresponding fractional abundances. Independent Component Analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. The basic goal of ICA is to find a linear transformation that recovers independent sources (abundance fractions) given only sensor observations that are unknown linear mixtures of the unobserved independent sources. In hyperspectral imagery, the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process; thus, the sources cannot be independent. This paper addresses hyperspectral data source dependence and its impact on ICA performance. The study considers simulated and real data. In the simulated scenarios, hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications. We conclude that ICA does not correctly unmix all sources; this conclusion is based on a study of the mutual information. Nevertheless, some sources may be well separated, mainly if the number of sources is large and the signal-to-noise ratio (SNR) is high.
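The sketch below reproduces the dependence problem on simulated data: abundance fractions drawn from a Dirichlet distribution sum to one (and are therefore dependent), are linearly mixed with random endmember spectra, and are then estimated with scikit-learn's FastICA. The endmembers, noise level, and evaluation by correlation are illustrative assumptions.

# Minimal sketch of ICA-based unmixing on simulated hyperspectral pixels.
# Endmember spectra, Dirichlet abundances (which sum to one and are thus
# dependent), and the correlation-based check are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
n_pixels, n_bands, n_endmembers = 2000, 50, 3

# Simulated endmember signatures (smooth random spectra).
endmembers = np.abs(np.cumsum(rng.normal(0, 0.1, (n_endmembers, n_bands)), axis=1)) + 0.5
# Abundance fractions per pixel sum to one (Dirichlet), hence are dependent.
abundances = rng.dirichlet(alpha=[1.0] * n_endmembers, size=n_pixels)
# Linear mixing model with additive noise (moderate SNR).
observations = abundances @ endmembers + rng.normal(0, 0.01, (n_pixels, n_bands))

ica = FastICA(n_components=n_endmembers, random_state=0)
estimated_sources = ica.fit_transform(observations)  # candidate abundance fractions

# Correlate each estimated source with each true abundance to judge separation.
corr = np.corrcoef(estimated_sources.T, abundances.T)[:n_endmembers, n_endmembers:]
print("abs correlation, estimated vs. true abundances:\n", np.abs(corr).round(2))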