30 results for blood collection techniques
at Digital Commons at Florida International University
Abstract:
This single-case study provides a description and explanation of selected adult students' perspectives on the impact that the development of an experiential learning portfolio had on their understanding of their professional and personal lives. The conceptual framework that undergirded the study included theoretical and empirical studies on adult learning, experiential learning, and the academic quality of nontraditional degree programs with a portfolio component. The study employed the qualitative data collection techniques of individual interviews, document review, field notes, and a researcher journal. A purposive sample of 8 adult students who completed portfolios as a component of their undergraduate degrees participated in the study. The 4 male and 4 female students who were interviewed represented 4 ethnic/racial groups and ranged in age from 32 to 55 years. Each student's portfolio was read prior to the interview to frame the semi-structured interview questions in light of written portfolio documents. Students were interviewed twice over a 3-month period. The study lasted 8 months from data collection to final presentation of the findings. The data from interview transcriptions and student portfolios were analyzed, categorized, coded, and sorted into 4 major themes and 2 additional themes and submitted to interpretive analysis. Participants' attitudes, perceptions, and opinions of their learning from the portfolio development experience were presented in the findings, which were illustrated through the use of excerpts from interview responses and individual portfolios. The participants displayed a positive reaction to the learning they acquired from the portfolio development process, regardless of their initial concerns about the challenges of creating a portfolio. Concerns were replaced by a greater recognition and understanding of their previous professional and personal accomplishments and their ability to reach future goals.
Other key findings included (a) a better understanding of the role work played in their learning and development, (b) a deeper recognition of the impact of mentors and role models throughout their lives, (c) an increase in writing and organizational competencies, and (d) a sense of self-discovery and personal empowerment.
Abstract:
Establishing an association between the scent a perpetrator left at a crime scene and the odor of the suspect of that crime is the basis for the use of human scent identification evidence in a court of law. Law enforcement agencies gather evidence through the collection of scent from the objects that a perpetrator may have handled during the execution of the criminal act. The collected scent evidence is subsequently presented to canines in identification line-up procedures with the apprehended suspects. Presently, canine scent identification is admitted as expert witness testimony; however, the accuracy of the dogs' behavior and the scent collection methods used are often challenged by the court system. The primary focus of this research project was an evaluation of contact and non-contact scent collection techniques, with an emphasis on the optimization of collection materials of different fiber chemistries, to evaluate the chemical odor profiles obtained under varying environmental conditions and to provide a better scientific understanding of human scent as a discriminative tool in the identification of suspects. The collection of hand odor from female and male subjects through both contact and non-contact sampling approaches yielded new insights into the types of VOCs collected when different materials are utilized, an analysis that had not previously been performed instrumentally. Furthermore, the collected scent mass was highest for both male and female hand odor samples on cotton sorbent materials. Compared to non-contact sampling, the contact sampling methods yielded a higher number of volatiles, an enhancement of up to 3 times, as well as a scent mass more than an order of magnitude higher. The evaluation of the STU-100 as a non-contact methodology highlighted strong instrumental drawbacks that need to be addressed for enhanced scientific validation of current field practices.
These results demonstrated that an individual's human scent components vary considerably depending on the method used to collect scent from the same body region. This study demonstrated the importance of collection medium selection as well as the collection method employed in providing a reproducible human scent sample that can be used to differentiate individuals.
Abstract:
Studies have shown that the environmental conditions of the home are important predictors of health, especially in low-income communities. Understanding the relationship between the environment and health is crucial in the management of certain diseases. One health outcome related to the home environment among urban, minority, and low-income children is childhood lead poisoning. The most common sources of lead exposure for children are lead paint in older, dilapidated housing and contaminated dust and soil produced by accumulated residue of leaded gasoline. Blood lead levels (BLL) as low as 10 μg/dL in children are associated with impaired cognitive function, behavior difficulties, and reduced intelligence. Recently, it has been suggested that the standard for intervention be lowered to a BLL of 5 μg/dL. The objectives of our report were to assess the prevalence of lead poisoning among children under six years of age and to quantify and test the correlations between BLL in children and lead exposure levels in their environment. This cross-sectional analysis was restricted to 75 children under six years of age who lived in 6 zip code areas of inner city Miami. These locations exhibited unacceptably high levels of lead in dust and soil in areas where children live and play. Using 5 μg/dL as the cutoff point, the prevalence of lead poisoning in the study sample was 13.33%. The study revealed that lead levels in floor dust and window sill samples were positively and significantly correlated with BLL among children (p < 0.05). However, the correlations between BLL and the soil, air, and water samples were not significant. Based on this pilot study, a more comprehensive environmental study in surrounding inner city areas is warranted. Parental education on proper housecleaning techniques may also benefit those living in the high lead-exposure communities of inner city Miami.
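The two quantities reported above, a cutoff-based prevalence and a tested correlation between environmental lead and BLL, can be illustrated with a small sketch. All numbers below are fabricated for illustration and are not the study's data:

```python
import math

def prevalence(blls, cutoff=5.0):
    """Fraction of children at or above the BLL cutoff (micrograms/dL)."""
    return sum(1 for b in blls if b >= cutoff) / len(blls)

def pearson_r(x, y):
    """Pearson correlation between paired environmental and blood measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Illustrative (fabricated) values only:
blls = [2.1, 3.4, 6.2, 1.8, 5.5, 2.9, 7.1, 3.0, 4.2, 2.5]
floor_dust = [10, 15, 40, 8, 35, 12, 50, 14, 22, 11]
print(prevalence(blls))            # 0.3 with this toy sample
print(pearson_r(floor_dust, blls)) # strong positive correlation
```

With this toy sample, 3 of 10 children are at or above the 5 μg/dL cutoff, and the floor-dust values track BLL closely, mirroring the kind of significant positive correlation the study reports for floor dust and window sill samples.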
Abstract:
This dissertation develops a new figure of merit to measure the similarity (or dissimilarity) of Gaussian distributions through a novel concept that relates the Fisher distance to the percentage of data overlap. The derivations are expanded to provide a generalized mathematical platform for determining an optimal separating boundary of Gaussian distributions in multiple dimensions. Real-world data used for implementation and for the feasibility studies were provided by Beckman-Coulter. Although the data used is flow cytometric in nature, the mathematics are general in their derivation and extend to other types of data as long as their statistical behavior approximates Gaussian distributions. Because this new figure of merit is heavily based on the statistical nature of the data, a new filtering technique is introduced to accommodate the accumulation process involved with histogram data. When data is accumulated into a frequency histogram, the data is inherently smoothed in a linear fashion, since an averaging effect takes place as the histogram is generated. This new filtering scheme addresses data accumulated in the uneven resolution of the channels of the frequency histogram. The qualitative interpretation of flow cytometric data is currently a time-consuming and imprecise method for evaluating histogram data. The method developed here offers a broader spectrum of capabilities in the analysis of histograms, since the figure of merit derived in this dissertation integrates within its mathematics both a measure of similarity and the percentage of overlap between the distributions under analysis.
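A one-dimensional special case illustrates how a standardized (Fisher-style) distance between two Gaussians maps to a percentage of overlap. This is a generic equal-variance sketch, not the dissertation's actual figure of merit:

```python
import math

def normal_cdf(x):
    """Standard normal cumulative distribution function via erf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def overlap_fraction(mu1, mu2, sigma):
    """Overlap area of two equal-variance 1-D Gaussians.
    The optimal separating boundary is the midpoint between the means;
    each distribution's tail beyond it contributes Phi(-d/2), where
    d = |mu1 - mu2| / sigma is the standardized distance."""
    d = abs(mu1 - mu2) / sigma
    return 2.0 * normal_cdf(-d / 2.0)

print(overlap_fraction(0.0, 0.0, 1.0))  # identical distributions -> 1.0
print(overlap_fraction(0.0, 4.0, 1.0))  # well separated -> ~0.046
```

As the standardized distance grows, the overlap percentage shrinks monotonically, which is the intuition behind tying a distance measure and an overlap measure into a single figure of merit.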
Abstract:
This dissertation presents dynamic flow experiments with fluorescently labeled platelets that allow spatial observation of wall attachment in inter-strut spacings and investigation of its relationship to flow patterns. Human blood with fluorescently labeled platelets was circulated through an in vitro system that produced physiologic pulsatile flow in (1) a parallel plate flow chamber that contained two-dimensional (2D) stents featuring completely recirculating flow, partially recirculating flow, and completely reattached flow, and (2) a three-dimensional (3D) cylindrical tube that contained stents of various geometric designs. Flow detachment and reattachment points exhibited very low platelet deposition. Platelet deposition was very low in the recirculation regions of the 3D stents, unlike the 2D stents. Deposition distal to a strut was always high in both 2D and 3D stents. Spirally recirculating regions, not present in the 2D stents, were found in the 3D stents; there, deposition was higher than at well-separated regions of recirculation.
Abstract:
The development of a new set of frost property measurement techniques to be used in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam element load sensor was used to obtain the weight of a deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operational conditions. An improvement of existing frost growth mathematical models was also investigated. The early stage of frost nucleation was commonly not considered in these models; instead, an initial value of layer thickness and porosity was regularly assumed. A nucleation model to obtain the droplet diameter and surface porosity at the end of the early frosting period was developed. Drop-wise early condensation on a cold flat plate under natural convection to hot (room temperature) and humid air was modeled. A nucleation rate was found, and the relation of heat to mass transfer (the Lewis number) was obtained. The Lewis number was found to be much smaller than unity, the standard value usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full growth stages of the frosting process. The combination of frost top temperature and weight variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
Abstract:
The need to provide computers with the ability to distinguish the affective state of their users is a major requirement for the practical implementation of affective computing concepts. This dissertation proposes the application of signal processing methods to physiological signals to extract features that can be processed by learning pattern recognition systems to provide cues about a person's affective state. In particular, combining physiological information sensed from a user's left hand in a non-invasive way with pupil diameter information from an eye-tracking system may provide a computer with an awareness of its user's affective responses in the course of human-computer interactions. In this study, an integrated hardware-software setup was developed to achieve automatic assessment of the affective status of a computer user. A computer-based "Paced Stroop Test" was designed as a stimulus to elicit emotional stress in the subject during the experiment. Four signals: the Galvanic Skin Response (GSR), the Blood Volume Pulse (BVP), the Skin Temperature (ST), and the Pupil Diameter (PD), were monitored and analyzed to differentiate affective states in the user. Several signal processing techniques were applied to the collected signals to extract their most relevant features. These features were analyzed with learning classification systems to accomplish the affective state identification. Three learning algorithms: Naïve Bayes, Decision Tree, and Support Vector Machine, were applied to this identification process and their levels of classification accuracy were compared. The results achieved indicate that the physiological signals monitored do, in fact, have a strong correlation with changes in the emotional states of the experimental subjects. These results also revealed that the inclusion of pupil diameter information significantly improved the performance of the emotion recognition system.
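The feature-extraction step described above can be sketched generically. The window values and feature choices below are hypothetical illustrations, not the dissertation's actual processing pipeline:

```python
import math

def window_features(signal, fs):
    """Simple per-window features of the kind commonly extracted from
    physiological signals such as GSR or skin temperature:
    mean level, standard deviation, and overall slope."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((s - mean) ** 2 for s in signal) / n
    # Total change divided by elapsed time (n - 1 sample intervals at rate fs):
    slope = (signal[-1] - signal[0]) * fs / (n - 1)
    return {"mean": mean, "std": math.sqrt(var), "slope": slope}

# Hypothetical 1 Hz GSR window (microsiemens); a rising level is one
# common cue associated with increased arousal.
gsr = [2.0, 2.1, 2.3, 2.6, 3.0, 3.5]
print(window_features(gsr, fs=1.0))
```

Feature vectors of this kind, one per signal per time window, are what classifiers such as Naïve Bayes, a decision tree, or an SVM would then be trained on.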
Abstract:
The purpose of this thesis was to build the Guitar Application ToolKit (GATK), a series of applications used to expand the sonic capabilities of the acoustic/electric stereo guitar. A further goal of the GATK was to extend the improvisational capabilities and compositional techniques generated by this innovative instrument. During the GATK creation process, current guitar production techniques and the overall sonic result were enhanced by planning and implementing a personalized electro-acoustic performance setup, designing custom-made performance interfaces, creating interactive compositional strategies, crafting non-standardized sounds, and controlling various music parameters in real time using the Max/MSP programming environment. This was the first thesis project of its kind. It is expected that this thesis will be useful as a reference paper for electronic musicians and music technology students; as a product demonstration for companies that manufacture the relevant software; and as a personal portfolio for future technology-related jobs.
Abstract:
This dissertation research project addressed the question of how hydrologic restoration of the Everglades is affecting the nutrient dynamics of marsh ecosystems in the southern Everglades. These effects were analyzed by quantifying nitrogen (N) cycle dynamics in the region. I utilized stable isotope tracer techniques to investigate nitrogen uptake and cycling among the major ecosystem components of the freshwater marsh system. I recorded the natural isotopic signatures (δ15N and δ13C) of major ecosystem components from the three major watersheds of the Everglades: Shark River Slough, Taylor Slough, and the C-111 basin. Analysis of δ15N and δ13C natural abundance data was used to demonstrate the spatial extent to which nitrogen from anthropogenic or naturally enriched sources is entering the marshes of the Everglades. In addition, I measured the fluxes of N between various ecosystem components at both near-canal and estuarine ecotone locations. Lastly, I investigated the effect of three phosphorus load treatments (0.00 mg P m⁻², 6.66 mg P m⁻², and 66.6 mg P m⁻²) on the rate and magnitude of ecosystem N uptake and N cycling. The δ15N and δ13C natural abundance data supported the hypothesis that ecosystem components from near-canal sites have heavier, more enriched δ15N isotopic signatures than those from downstream sites. The natural abundance data also showed that the marshes of the southern Everglades act as a sink for isotopically heavier, canal-borne dissolved inorganic nitrogen (DIN) and as a source of "new" marsh-derived dissolved organic nitrogen (DON). In addition, the 15N mesocosm data showed rapid assimilation of the 15N tracer by the periphyton component and delayed N uptake by the soil and macrophyte components in the southern Everglades.
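Apportioning nitrogen between an isotopically enriched canal source and a lighter marsh source, as in the natural abundance analysis above, follows a standard two-end-member mass balance. The δ15N values below are hypothetical, not the study's measurements:

```python
def mixing_fraction(delta_sample, delta_source_a, delta_source_b):
    """Two-end-member isotope mixing model: the fraction of a sample's N
    derived from source A, given the isotopic signature of the sample
    and of both candidate sources (standard mass-balance formula)."""
    return (delta_sample - delta_source_b) / (delta_source_a - delta_source_b)

# Hypothetical signatures (permil): an enriched canal DIN end member (9.0)
# versus a lighter marsh-derived end member (2.0).
f_canal = mixing_fraction(delta_sample=6.0, delta_source_a=9.0, delta_source_b=2.0)
print(round(f_canal, 3))  # 0.571
```

A sample whose δ15N sits closer to the canal end member yields a fraction near 1, which is how heavier near-canal signatures indicate a larger contribution of canal-borne N.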
Abstract:
The nation's freeway systems are becoming increasingly congested. A major contributor to traffic congestion on freeways is traffic incidents. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary roadway capacity reduction, and they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic to avoid incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, the duration of an incident is affected by many contributing factors. Determining and understanding these factors can help the process of identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success. This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid in decision making about traffic diversion in the event of an ongoing incident. Multiple data mining analysis techniques were applied and evaluated in the research. Multiple linear regression analysis and a decision tree based method were applied to develop the offline models, and a rule-based method and a tree algorithm called M5P were used to develop the online models. The results show that the models in general can achieve high prediction accuracy, within acceptable intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of District 4 FDOT for actual applications.
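The simplest member of the offline model family above is an ordinary least-squares fit of duration against a single contributing factor. The records below are fabricated for illustration, not FDOT data:

```python
def fit_linear(x, y):
    """Least-squares fit of y = a + b*x, the one-predictor special case
    of the multiple linear regression used for 'offline' duration models."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical records: lanes blocked vs. incident duration in minutes.
lanes = [1, 1, 2, 2, 3, 3]
minutes = [20, 25, 38, 42, 61, 65]
a, b = fit_linear(lanes, minutes)
print(a, b)  # slope b estimates extra minutes per additional blocked lane
```

A real offline model would regress on many such factors at once (time of day, incident type, vehicles involved), but the fitting principle is the same.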
Abstract:
The goal of this research was to establish new optimization methods for pattern recognition and classification of different white blood cells in actual patient data to enhance the process of diagnosis. Beckman-Coulter Corporation supplied flow cytometry data from numerous patients that were used as training sets to exploit the different physiological characteristics of the samples provided. Support Vector Machines (SVM) and Artificial Neural Networks (ANN) were used as promising pattern classification techniques to identify different white blood cell samples and provide information to medical doctors in the form of diagnostic references for specific disease states such as leukemia. The results obtained show that when a neural network classifier is well configured and trained with cross-validation, it can perform better than support vector classifiers alone for this type of data. Furthermore, a new unsupervised learning algorithm, the Density-based Adaptive Window Clustering (DAWC) algorithm, was designed to process large volumes of data and find locations of high-density data clusters in real time. It reduces the computational load to ∼O(N) computations, making the algorithm more attractive and faster than current hierarchical algorithms.
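The O(N) density-window idea can be illustrated on a 1-D histogram with a single-pass sliding window. This toy version fixes the window width, whereas DAWC is described as adaptive, so treat it only as a sketch of the complexity argument:

```python
def densest_window(counts, width):
    """Single-pass sliding window over a 1-D histogram: returns the start
    channel of the width-channel window holding the most events. Each step
    updates the running sum in O(1), so the whole scan is O(N), unlike the
    pairwise comparisons behind hierarchical clustering."""
    best_start, best_sum = 0, sum(counts[:width])
    cur = best_sum
    for i in range(1, len(counts) - width + 1):
        cur += counts[i + width - 1] - counts[i - 1]  # slide window by one
        if cur > best_sum:
            best_start, best_sum = i, cur
    return best_start, best_sum

counts = [1, 2, 1, 8, 9, 7, 2, 1, 0, 1]  # toy event counts per channel
print(densest_window(counts, 3))  # (3, 24): densest cluster at channels 3-5
```

Finding the densest region in a single pass is what keeps the computational load linear in the number of channels.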
Abstract:
Distributed applications are exposed as reusable components that are dynamically discovered and integrated to create new applications. These new applications, in the form of aggregate services, are vulnerable to failure due to the autonomous and distributed nature of their integrated components. This vulnerability creates the need for adaptability in aggregate services. The need for adaptation is accentuated for complex long-running applications as is found in scientific Grid computing, where distributed computing nodes may participate to solve computation and data-intensive problems. Such applications integrate services for coordinated problem solving in areas such as Bioinformatics. For such applications, when a constituent service fails, the application fails, even though there are other nodes that can substitute for the failed service. This concern is not addressed in the specification of high-level composition languages such as that of the Business Process Execution Language (BPEL). We propose an approach to transparently autonomizing existing BPEL processes in order to make them modifiable at runtime and more resilient to the failures in their execution environment. By transparent introduction of adaptive behavior, adaptation preserves the original business logic of the aggregate service and does not tangle the code for adaptive behavior with that of the aggregate service. The major contributions of this dissertation are: first, we assessed the effectiveness of BPEL language support in developing adaptive mechanisms. As a result, we identified the strengths and limitations of BPEL and came up with strategies to address those limitations. Second, we developed a technique to enhance existing BPEL processes transparently in order to support dynamic adaptation. We proposed a framework which uses transparent shaping and generative programming to make BPEL processes adaptive. Third, we developed a technique to dynamically discover and bind to substitute services. 
Our technique was evaluated, and the results showed that dynamic utilization of components improves the flexibility of adaptive BPEL processes. Fourth, we developed an extensible policy-based technique to specify how to handle exceptional behavior. We also developed a generic component that introduces adaptive behavior into multiple BPEL processes. Fifth, we identified ways to apply our work to facilitate adaptability in composite Grid services.
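The substitute-service idea, binding to an equivalent component when a constituent service fails, can be sketched in miniature. This plain-Python failover loop with hypothetical service names is only an analogy for the BPEL-level mechanism described above, which operates on process definitions rather than function calls:

```python
def invoke_with_substitution(services, request):
    """Try equivalent services in turn; on failure, bind to a substitute.
    A minimal sketch of dynamic substitution: the aggregate 'process'
    succeeds as long as any candidate node can handle the request."""
    errors = []
    for service in services:
        try:
            return service(request)
        except Exception as exc:  # a constituent service failed
            errors.append(exc)
    raise RuntimeError(f"all {len(services)} candidate services failed: {errors}")

def primary(req):  # hypothetical failing node
    raise ConnectionError("primary node down")

def substitute(req):  # hypothetical discovered substitute
    return f"handled:{req}"

print(invoke_with_substitution([primary, substitute], "blast-query"))
# prints "handled:blast-query"
```

The transparent-shaping approach in the dissertation achieves a similar effect without tangling failover logic into the original business logic.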
Abstract:
The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording, and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has focused mostly on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitate the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and to offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is considered essential to documenting network incidents. However, in network topology spaces, location cannot be measured due to the absence of a 'distance metric'. Therefore, a novel solution was proposed to label the locations of nodes within network topology spaces and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHT) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollable recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source.
These techniques were a starting point for further research and development that would result in equipping future ad hoc networks with forensic components to complement existing security mechanisms.
Abstract:
Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's predictions. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied in two different hydrological analyses to explore the capacity of a variable-density model to guide data collection. The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model. The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology.
The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can be used to give insight into submarine groundwater discharge and to guide data collection.
Abstract:
A comprehensive investigation of sensitive ecosystems in South Florida, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments, is reported. This study presents the development and validation of a method for the fractionation and isolation of twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including 2,4-D, MCPA, dichlorprop, mecoprop, and picloram, in surface water. Solid phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization (API) with electrospray ionization in negative mode in a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest. The application of laser ablation ICP-MS (LA-ICP-MS) methodology to the analysis of soils and sediments is also reported in this study. The analytical performance of the method was evaluated on certified standards and real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using this powerful technique as a routine and rapid method to monitor potentially contaminated sites. Forty-eight sediments were also collected from semi-pristine areas in South Florida to conduct screening of baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of the elemental composition as a tool for environmental forensics. A LA-ICP-MS protocol was also developed and optimized for the elemental analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed for rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions.
These distributions were used to qualitatively and quantitatively assess differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.