8 results for Statistics in sensory analysis

at Digital Commons at Florida International University


Relevance: 100.00%

Abstract:

The purpose of this research was to demonstrate the applicability of reduced-size STR (Miniplex) primer sets to challenging samples and to provide the forensic community with new information regarding the analysis of degraded and inhibited DNA. The Miniplex primer sets were validated in accordance with guidelines set forth by the Scientific Working Group on DNA Analysis Methods (SWGDAM) in order to demonstrate the scientific validity of the kits. The Miniplex sets were also used in the analysis of DNA extracted from human skeletal remains and telogen hair. In addition, a method for evaluating the mechanism of PCR inhibition was developed using qPCR. The Miniplexes proved to be a robust and sensitive tool for the analysis of DNA, yielding results from as little as 100 pg of template DNA. They also outperformed commercial kits in the analysis of DNA from human skeletal remains, with 64% of samples tested producing full profiles, compared to 16% for a commercial kit. The Miniplexes also amplified nuclear DNA from human telogen hairs, with partial profiles obtained from as little as 60 pg of template DNA. These data suggest smaller PCR amplicons may provide a useful alternative to mitochondrial DNA for forensic analysis of degraded DNA from human skeletal remains, telogen hairs, and other challenging samples. In the evaluation of inhibition by qPCR, the effects of amplicon length and primer melting temperature were evaluated in order to determine the binding mechanisms of different PCR inhibitors. Several mechanisms were indicated by the inhibitors tested, including binding of the polymerase, binding to the DNA, and effects on the processivity of the polymerase during primer extension. The qPCR data illustrated a method by which the type of inhibitor could be inferred in forensic samples, and some methods of reducing inhibition for specific inhibitors were demonstrated. An understanding of the mechanism of the inhibitors found in forensic samples will allow analysts to select the proper methods for inhibition removal or the type of analysis that can be performed, and will increase the information that can be obtained from inhibited samples.
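
As a purely illustrative sketch of the kind of inference the qPCR method enables, the following Python fragment classifies an inhibitor from hypothetical delta-Cq shifts; the thresholds, decision rules, and function name are invented here, not taken from the study.

```python
# Illustrative sketch only: infers a plausible PCR-inhibition mechanism from
# shifts in qPCR quantification cycle (delta-Cq) relative to uninhibited
# controls. Thresholds and decision rules are hypothetical, not the study's.

def infer_inhibition_mechanism(dcq_short: float, dcq_long: float,
                               dcq_low_tm: float, dcq_high_tm: float,
                               threshold: float = 1.0) -> str:
    """Classify an inhibitor from four delta-Cq measurements.

    dcq_short / dcq_long : delta-Cq for short vs. long amplicons
    dcq_low_tm / dcq_high_tm : delta-Cq for low- vs. high-Tm primers
    """
    length_effect = dcq_long - dcq_short   # worse with longer amplicons
    tm_effect = dcq_low_tm - dcq_high_tm   # worse with weaker primer binding

    if length_effect > threshold:
        # Delay grows with amplicon length: inhibitor likely impedes the
        # polymerase's processivity during primer extension.
        return "reduced polymerase processivity"
    if tm_effect > threshold:
        # Low-Tm primers suffer more: inhibitor likely binds the DNA template
        # and competes with primer annealing.
        return "DNA binding"
    if min(dcq_short, dcq_long, dcq_low_tm, dcq_high_tm) > threshold:
        # Uniform delay regardless of amplicon or primer: consistent with
        # direct binding or sequestration of the polymerase.
        return "polymerase binding"
    return "no clear inhibition"

# Example: a uniform ~2-cycle delay suggests polymerase binding.
print(infer_inhibition_mechanism(2.1, 2.3, 2.2, 2.0))
```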

Relevance: 100.00%

Abstract:

The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development and is widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address these difficulties. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e., objects and classes. It is one of the most commonly used UML diagrams in practice, yet there has been little research on sequence-diagram modeling, and the current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would be greatly beneficial to novice analysts who, unlike experienced systems analysts, do not possess relevant prior experience from which to learn how to develop a sequence diagram. There is a need for an effective sequence-diagram modeling technique for novices. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of the approaches provided in many textbooks as well as the practitioner literature. The results indicated that novice analysts performed better using the CHOP technique, an outcome that seems to have been enabled by the pattern-based heuristics the technique provides. Novice analysts also rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that CHOP is an effective sequence-diagram modeling technique for novice analysts.
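
As a loose illustration of the three CHOP steps, the toy Python sketch below chunks the interactions of a hypothetical use case, orders them, and matches each chunk to a named pattern; the data model, keyword rules, and pattern names are invented for illustration and are not the study's materials.

```python
# Hypothetical illustration of the three CHOP steps on a toy use case:
# interactions are chunked into cohesive sub-tasks, chunks are ordered,
# and each chunk is matched against a library of interaction patterns.
from collections import defaultdict

use_case_steps = [
    ("validate order", 2), ("receive order", 1), ("charge card", 3),
    ("confirm payment", 4), ("ship items", 5),
]

# CHunking: group steps into cohesive sub-tasks (here, by keyword).
chunks = defaultdict(list)
for step, seq in use_case_steps:
    task = "payment" if any(w in step for w in ("charge", "payment")) else "order"
    chunks[task].append((seq, step))

# Ordering: sequence steps within each chunk, and chunks by earliest step.
ordered = sorted((min(s for s, _ in v), k, sorted(v)) for k, v in chunks.items())

# Patterning: map each chunk to a known interaction pattern (toy lookup).
patterns = {"order": "controller-entity", "payment": "external-service call"}
for _, task, steps in ordered:
    print(task, "->", patterns[task], [name for _, name in steps])
```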

Relevance: 100.00%

Abstract:

This study attempts to understand the nature of the knowledge base that supports the ability to select statistical techniques for research situations. Findings showed that the largest component of such knowledge was related to research design. One implication is that statistical techniques should be taught in relation to features of research design.
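
A minimal sketch of that implication, with textbook example entries rather than the study's actual knowledge base: statistical techniques indexed by features of the research design.

```python
# Illustrative only: a tiny lookup from research-design features to a
# statistical technique. Entries are standard textbook pairings chosen
# for illustration, not the study's findings.
technique_by_design = {
    ("two independent groups", "continuous outcome"): "independent-samples t-test",
    ("two related measurements", "continuous outcome"): "paired t-test",
    ("3+ independent groups", "continuous outcome"): "one-way ANOVA",
    ("two categorical variables", "frequency counts"): "chi-square test",
    ("several predictors", "continuous outcome"): "multiple regression",
}

design = ("two independent groups", "continuous outcome")
print(technique_by_design[design])  # -> independent-samples t-test
```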

Relevance: 100.00%

Abstract:

The purpose of this thesis is twofold: it presents six original pieces composed and arranged by the author, and it provides a thorough analysis of each. The compositions draw from many different musical genres: contemporary jazz, swing, funk, fusion, soul, neo-soul, and rhythm and blues. Applications of melodic, harmonic, and rhythmic techniques derived from these genres can be found throughout the original compositions, which are inspired by -- and attempt to narrate -- life experiences. Parallels between life and music are drawn and explained. By way of introduction, some information is given regarding the ensemble that first performed these original compositions. The ensemble comprised trumpet, tenor saxophone, keyboards, piano, electric bass, upright acoustic bass, drums, and percussion.

Relevance: 100.00%

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasuries is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here: the first is principal components analysis, the second wavelet analysis. In both approaches, both the time and maturity variables are discretized. In principal components analysis, the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized, and the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and specially adapted wavelet construction. The result is more robust statistics, which provide balance to the more fragile principal components analysis.
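
A minimal sketch of the principal-components step described above, using synthetic yield-curve shifts in place of real Treasury data; the maturities, factor structure, and magnitudes are illustrative assumptions.

```python
# Sketch of yield-curve PCA on synthetic data: build daily shift vectors,
# diagonalize their covariance matrix, and report explained variation.
import numpy as np

rng = np.random.default_rng(0)
maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 20, 30])  # years
n_days, n_mat = 500, len(maturities)

# Synthetic daily shifts: a level move plus a slope move plus noise,
# mimicking the dominant factors typically found in yield-curve PCA.
level = rng.normal(0, 0.05, (n_days, 1)) * np.ones((1, n_mat))
slope = rng.normal(0, 0.02, (n_days, 1)) * (maturities / 30)
shifts = level + slope + rng.normal(0, 0.005, (n_days, n_mat))

# Diagonalize the covariance matrix of the shift vectors.
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)              # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending

explained = eigvals / eigvals.sum()
print("variation captured by top components:", explained[:3].round(3))
```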

Relevance: 100.00%

Abstract:

This study examines how public management practitioners in small and medium-sized Florida cities perceive globalization and its impact on public management practice. Using qualitative analysis, descriptive statistics, and factor analysis, data obtained from a survey and semi-structured interviews were studied to comprehend how public managers view the management and control of their municipalities in a time of globalization. The study shows that these managers' perceptions of globalization and its impact on public management are nuanced: whereas some public managers feel that globalization has significant impacts on municipalities' viability, others opine that globalization has no local impact. The study further finds that globalization processes are perceived as altering the public management functions of decision-making, economic development, and service delivery in some of these cities as a result of transnational shifts, rapidly changing technologies, and municipalities' heightened involvement in the global economy. The study concludes that the globalization discourse does not resonate among some public managers in Florida's small and medium-sized cities in the ways implied in the extant literature.
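
As an illustrative sketch of the factor-analysis step, the fragment below fits a two-factor model to synthetic Likert-style survey responses using scikit-learn; the item structure, factor count, and loadings are invented, not the study's instrument.

```python
# Sketch only: factor analysis of synthetic survey responses. The latent
# attitudes, loading matrix, and sample size are invented assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_managers, n_items = 120, 6

# Two latent attitudes (e.g., "perceived global impact", "local autonomy")
# generate correlated Likert-style responses with noise.
latent = rng.normal(size=(n_managers, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.2],
                     [0.1, 0.8], [0.0, 0.9], [0.2, 0.7]])
responses = latent @ loadings.T + rng.normal(0, 0.4, (n_managers, n_items))

fa = FactorAnalysis(n_components=2, random_state=0).fit(responses)
print(np.round(fa.components_, 2))  # estimated item loadings per factor
```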

Relevance: 100.00%

Abstract:

Financial innovations have emerged globally to close the gap between the rising global demand for infrastructure and the financing available through traditional mechanisms such as fuel taxation, tax-exempt bonds, and federal and state funds. The key to sustainable innovative financing mechanisms is effective policymaking. This paper discusses the theoretical framework of a research study whose objective is to structurally and systemically assess financial innovations in global infrastructures. The research aims to create analysis frameworks, taxonomies and constructs, and simulation models pertaining to the dynamics of the innovation process, to be used in policy analysis. Structural assessment of innovative financing focuses on the typologies and loci of innovations and evaluates the performance of different types of innovative financing mechanisms. Systemic analysis of innovative financing explores the determinants of the innovation process using the System of Innovation approach. The final deliverables of the research include propositions pertaining to the constituents of a System of Innovation for infrastructure finance: the players, institutions, activities, and networks. These static constructs are used to develop a hybrid Agent-Based/System Dynamics simulation model to derive propositions regarding the emergent dynamics of the system. The initial outcomes of the research study presented in this paper include: (a) an archetype for mapping innovative financing mechanisms, (b) a System of Systems-based analysis framework to identify the dimensions of Systems of Innovation analyses, and (c) initial observations regarding the players, institutions, activities, and networks of the System of Innovation in the context of U.S. transportation infrastructure financing.
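
A toy sketch of the kind of hybrid Agent-Based/System Dynamics loop the paper proposes: hypothetical "financier" agents decide whether to adopt an innovative mechanism while a system-dynamics funding stock evolves alongside them; every parameter and name here is an invented assumption.

```python
# Toy hybrid AB/SD simulation: agents adopt an innovative financing
# mechanism with probability rising in peer adoption, while a continuous
# stock of available funding responds to adoption. Illustration only.
import random

random.seed(42)

class Financier:
    def __init__(self):
        self.adopted = False

    def decide(self, adoption_rate: float) -> None:
        # Agent rule: adoption probability rises with peer adoption.
        if not self.adopted and random.random() < 0.05 + 0.3 * adoption_rate:
            self.adopted = True

agents = [Financier() for _ in range(100)]
funding_stock = 100.0  # SD stock: available infrastructure funding
dt = 1.0

for step in range(20):
    rate = sum(a.adopted for a in agents) / len(agents)
    for a in agents:
        a.decide(rate)
    # SD flows: adoption attracts inflows; projects draw the stock down.
    inflow, outflow = 10.0 * rate, 0.05 * funding_stock
    funding_stock += (inflow - outflow) * dt

print(f"adopters: {sum(a.adopted for a in agents)}, funding: {funding_stock:.1f}")
```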