14 results for Viscous Dampers, Five Step Method, Equivalent Static Analysis Procedure, Yielding Frames, Passive Energy Dissipation Systems
at Digital Commons at Florida International University
Abstract:
There is a growing body of literature that provides evidence for the efficacy of positive youth development programs in general, and preliminary empirical support for the efficacy of the Changing Lives Program (CLP) in particular. This dissertation sought to extend previous efforts to develop and preliminarily examine the Transformative Goal Attainment Scale (TGAS) as a measure of participant empowerment in the promotion of positive development. Consistent with recent advances in the use of qualitative research methods, this dissertation sought to further investigate the utility of Relational Data Analysis (RDA) for providing categorizations of qualitative open-ended response data. In particular, a qualitative index of Transformative Goals (TG) was developed to complement the previously developed quantitative index of Transformative Goal Attainment (TGA), and RDA procedures for calculating reliability and content validity were refined. Second, as a Stage I pilot/feasibility study, this dissertation preliminarily examined the potentially mediating role of empowerment, as indexed by the TGAS, in the promotion of positive development.
Fifty-seven participants took part in this study: forty CLP intervention participants and seventeen control condition participants. All 57 participants were administered the study's measures just prior to and just following the fall 2003 semester. The study thus used a short-term longitudinal quasi-experimental research design with a comparison control group.
RDA procedures were refined and applied to the categorization of open-ended response data regarding participants' transformative goals (TG) and future possible selves (PSQ-QE). These analyses revealed relatively strong, indirect evidence for the construct validity of the categories as well as their theoretically meaningful structural organization, thereby providing sufficient support for the utility of RDA procedures in the categorization of qualitative open-ended response data.
In addition, transformative goals (TG), future possible selves (PSQ-QE), and the quantitative index of perceived goal attainment (TGA) were evaluated as potential mediators of positive development by testing their relationships to other indices of positive intervention outcome within a four-step method involving both analysis of variance (ANOVAs and RM-ANOVAs) and regression analysis. Though more limited in scope than the efforts at the development and refinement of the measures of these mediators, the results were also promising.
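The four-step mediation logic described above can be sketched with ordinary least-squares regressions. The sketch below uses synthetic data; the variable names, effect sizes, and noise levels are illustrative assumptions, not the study's data. The idea: the total effect of the intervention on the outcome should shrink once the mediator is added to the model.

```python
import numpy as np

def ols_slope(X, y):
    """Least-squares coefficients of y ~ X (intercept added, then dropped)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:]

rng = np.random.default_rng(0)
n = 57                                         # sample size matching the study
x = rng.integers(0, 2, n).astype(float)        # 1 = intervention, 0 = control
m = 0.8 * x + rng.normal(0, 0.5, n)            # hypothetical mediator (empowerment)
y = 0.5 * m + 0.1 * x + rng.normal(0, 0.2, n)  # hypothetical outcome

c = ols_slope(x, y)[0]                         # step 1: total effect of x on y
a = ols_slope(x, m)[0]                         # step 2: effect of x on mediator
cp, b = ols_slope(np.column_stack([x, m]), y)  # steps 3-4: direct effect cp, mediator effect b
print(a, b, c, cp)                             # mediation: a and b nonzero, |cp| < |c|
```

With these synthetic effects, the direct effect `cp` comes out smaller in magnitude than the total effect `c`, the pattern a mediation analysis looks for.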
Abstract:
This study examined how the themes of environmental sustainability are evident in the national, state, and local standards that guide K–12 science curriculum. The study applied the principles of content analysis within the framework of an ecological paradigm. In education, an ecological paradigm focuses on students' use of a holistic lens to view and understand material. The intent of this study was to analyze the seventh-grade science content standards at the national, state, and local textbook levels to determine how, and the extent to which, each of the five themes of environmental sustainability is presented in the language of each text. The themes are: (a) Climate Change Indicators, (b) Biodiversity, (c) Human Population Density, (d) Impact and Presence of Environmental Pollution, and (e) Earth as a Closed System. The study offers practical insight on using a method of content analysis to locate keywords of environmental sustainability in the three texts and determine whether the context of each term relates to this ecological paradigm. Using a concordance program, the researcher identified the frequency and context of each vocabulary item associated with these themes. Nine chi-square tests were run to determine if there were differences in content between the national and state standards and the textbook. Within each level, chi-square tests were also run to determine if there were differences between the appearance of content-knowledge and skill words. Results indicate a lack of agreement between levels that is significant at p < .01. A discussion of these results in relation to curriculum development and standardized assessments follows. The study found that at the national and state levels, there is a lack of articulation of the goals of environmental sustainability or an ecological paradigm.
With respect to the science textbook, a greater number of keywords were present; however, the context of many of these keywords did not align with the discourse of an ecological paradigm. Further, the environmental sustainability themes present in the textbook were limited to the last four chapters of the text. Additional research is recommended to determine whether this situation also exists in other settings.
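The chi-square comparison described above can be illustrated with a small sketch. The keyword counts below are hypothetical, not the study's data; the test asks whether the proportion of sustainability keywords differs between two text levels, judged against the critical value for one degree of freedom at p = .01.

```python
def chi_square(observed):
    """Pearson chi-square statistic for an r x c contingency table of counts."""
    rows = [sum(r) for r in observed]
    cols = [sum(c) for c in zip(*observed)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, o in enumerate(row):
            expected = rows[i] * cols[j] / total
            stat += (o - expected) ** 2 / expected
    return stat

# hypothetical counts: rows = text level, cols = sustainability keyword present/absent
table = [[12, 88],   # state standards
         [40, 60]]   # textbook
stat = chi_square(table)
print(round(stat, 2), stat > 6.63)  # 6.63 = chi-square critical value, df = 1, p = .01
```

For these made-up counts the statistic is about 20.4, well past the p < .01 threshold, i.e. the two levels would disagree significantly.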
Abstract:
A comprehensive method for the analysis of 11 target pharmaceuticals representing multiple therapeutic classes was developed for biological tissues (fish) and water. Water samples were extracted using solid phase extraction (SPE), while fish tissue homogenates were extracted using accelerated solvent extraction (ASE) followed by mixed-mode cation exchange SPE cleanup and analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS). Among the 11 target pharmaceuticals analyzed, trimethoprim, caffeine, sulfamethoxazole, diphenhydramine, diltiazem, carbamazepine, erythromycin, and fluoxetine were consistently detected in reclaimed water. On the other hand, caffeine, diphenhydramine, and carbamazepine were consistently detected in fish and surface water samples. In order to understand the uptake and depuration of pharmaceuticals as well as bioconcentration factors (BCFs) under worst-case conditions, mosquitofish were exposed to reclaimed water under static renewal for 7 days, followed by a 14-day depuration phase in clean water. Characterization of the exposure media revealed the presence of 26 pharmaceuticals, while 5 pharmaceuticals, including caffeine, diphenhydramine, diltiazem, carbamazepine, and ibuprofen, were present in the organisms as early as 5 h from the start of the exposure. Liquid chromatography ultra-high resolution Orbitrap mass spectrometry was explored as a tool to identify and quantify phase II pharmaceutical metabolites in reclaimed water. The resulting data confirmed the presence of acetyl-sulfamethoxazole and sulfamethoxazole glucuronide in reclaimed water. To my knowledge, this is the first known report of sulfamethoxazole glucuronide surviving intact through wastewater treatment plants and occurring in environmental water samples.
Finally, five bioaccumulative pharmaceuticals detected in reclaimed water, including caffeine, carbamazepine, diltiazem, diphenhydramine, and ibuprofen, were investigated regarding the acute and chronic risks to aquatic organisms. The results indicated a low potential risk of carbamazepine even under the worst-case exposure scenario. Given the dilution factors that affect environmental releases, the risk of exposure to carbamazepine is expected to be even lower.
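As a minimal illustration of the bioconcentration factor (BCF) metric referenced above: a steady-state BCF is the ratio of tissue concentration to water concentration, while a kinetic BCF (useful when a 7-day exposure does not reach steady state) is the ratio of the uptake rate constant to the depuration rate constant. The numbers below are hypothetical placeholders, not the study's measured values.

```python
def bcf_steady_state(c_fish, c_water):
    """Steady-state BCF: tissue concentration / water concentration
    (same units in numerator and denominator, e.g. ng/g vs ng/mL)."""
    return c_fish / c_water

def bcf_kinetic(k_uptake, k_depuration):
    """Kinetic BCF: uptake rate constant / depuration rate constant."""
    return k_uptake / k_depuration

# hypothetical example values for illustration only
print(bcf_steady_state(150.0, 2.0))  # tissue 150, water 2 -> BCF 75
print(bcf_kinetic(12.0, 0.4))        # k_u 12/day, k_d 0.4/day -> BCF 30
```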
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures.
Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counter-parties' rights; thus controls may not be needed. The challenge is in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation.
The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, which combines multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represents an initial formal theory on the relationships between deontic processes and documentary procedures; and a working prototype that uses model-checking techniques to identify fraud potentials in a deontic process and generate control requirements to limit them.
Fourteen scenarios of two well-known international payment procedures, cash in advance and documentary credit, have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
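A minimal sketch of the Petri-net idea behind such a prototype: places hold tokens representing the state of the procedure, and a control requirement is enforced by making a transition's preconditions depend on evidence that a prior obligation was fulfilled. The places and transitions below are hypothetical simplifications of a cash-in-advance procedure, not the dissertation's Deontic Petri Nets.

```python
# token counts per place; the procedure starts with an order placed
marking = {"order_placed": 1, "paid": 0, "shipped": 0}

# each transition: (preconditions consumed, postconditions produced)
transitions = {
    "pay":  ({"order_placed": 1}, {"paid": 1}),
    "ship": ({"paid": 1}, {"shipped": 1}),  # control: shipping requires payment
}

def fire(marking, name):
    """Fire a transition if enabled; return whether it fired."""
    pre, post = transitions[name]
    if any(marking.get(p, 0) < n for p, n in pre.items()):
        return False                         # not enabled: control blocks it
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] = marking.get(p, 0) + n
    return True

print(fire(marking, "ship"))  # False: shipping before payment is blocked
print(fire(marking, "pay"))   # True
print(fire(marking, "ship"))  # True: obligation discharged in the right order
```

A model checker would explore all interleavings of such transitions to find orderings (fraud potentials) that a documentary procedure must rule out.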
Abstract:
The potential of solid phase microextraction (SPME) in the analysis of explosives is demonstrated. A sensitive, rapid, solventless, and inexpensive method for the analysis of explosives and explosive odors from solid and liquid samples has been optimized using SPME followed by HPLC and GC/ECD. SPME involves the extraction of the organic components in debris samples into sorbent-coated silica fibers, which can be transferred directly to the injector of a gas chromatograph. SPME/HPLC requires a special desorption apparatus to elute the extracted analyte onto the column at high pressure. Results obtained with GC/ECD are presented and compared to those gathered using HPLC analysis. The relative effects of controllable variables, including fiber chemistry, adsorption and desorption temperature, extraction time, and desorption time, have been optimized for various high explosives.
Abstract:
This research is motivated by the need for a systemic, efficient quality improvement methodology at universities. There exists no methodology designed for a total quality management (TQM) program in a university. The main objective of this study is to develop a TQM methodology that enables a university to efficiently develop an integral total quality improvement plan.
Current research focuses on the need to improve the quality of universities, the study of the perceived best-quality universities, and the measurement of the quality of universities through rankings. There is no evidence of research on how to plan an integral quality improvement initiative for the university as a whole, which is the main contribution of this study.
This research builds on various reference TQM models and criteria provided by ISO 9000, Baldrige, and Six Sigma, and on educational accreditation criteria found in ABET and SACS. The TQM methodology is proposed by following a seven-step meta-methodology. The proposed methodology guides the user to develop a TQM plan in five sequential phases: initiation, assessment, analysis, preparation, and acceptance. Each phase defines for the user its purpose, key activities, input requirements, controls, deliverables, and tools to use. The application of quality concepts in education, and in higher education in particular, is distinctive, since there are unique factors in education that ought to be considered. These factors shape the quality dimensions in a university and are the main inputs to the methodology.
The proposed TQM methodology guides the user to collect and transform appropriate inputs into a holistic TQM plan, ready to be implemented by the university. Different input data will lead to a unique TQM plan for the specific university at the time.
It may not necessarily transform the university into a world-class institution, but it aims to strive for stakeholder-oriented improvements, leading to better alignment with the university's mission and total quality advancement.
The proposed TQM methodology is validated in three steps. First, it is verified by going through a test activity as part of the meta-methodology. Second, the methodology is applied to a case university to develop a TQM plan. Lastly, the methodology and the TQM plan are both verified by an expert group consisting of TQM specialists and university administrators. The proposed TQM methodology is applicable to any university at any level of advancement, regardless of changes in its long-term vision and short-term needs. It helps to assure the quality of a TQM plan, while making the process more systemic, efficient, and cost-effective. This research establishes a framework with a solid foundation for extending the proposed TQM methodology into other industries.
Abstract:
Financial innovations have emerged globally to close the gap between the rising global demand for infrastructures and the availability of financing sources offered by traditional financing mechanisms such as fuel taxation, tax-exempt bonds, and federal and state funds. The key to sustainable innovative financing mechanisms is effective policymaking. This paper discusses the theoretical framework of a research study whose objective is to structurally and systemically assess financial innovations in global infrastructures. The research aims to create analysis frameworks, taxonomies and constructs, and simulation models pertaining to the dynamics of the innovation process to be used in policy analysis. Structural assessment of innovative financing focuses on the typologies and loci of innovations and evaluates the performance of different types of innovative financing mechanisms. Systemic analysis of innovative financing explores the determinants of the innovation process using the System of Innovation approach. The final deliverables of the research include propositions pertaining to the constituents of System of Innovation for infrastructure finance which include the players, institutions, activities, and networks. These static constructs are used to develop a hybrid Agent-Based/System Dynamics simulation model to derive propositions regarding the emergent dynamics of the system. The initial outcomes of the research study are presented in this paper and include: (a) an archetype for mapping innovative financing mechanisms, (b) a System of Systems-based analysis framework to identify the dimensions of Systems of Innovation analyses, and (c) initial observations regarding the players, institutions, activities, and networks of the System of Innovation in the context of the U.S. transportation infrastructure financing.
Abstract:
Airborne Light Detection and Ranging (LIDAR) technology has become the primary method to derive high-resolution Digital Terrain Models (DTMs), which are essential for studying Earth's surface processes, such as flooding and landslides. The critical step in generating a DTM is to separate ground and non-ground measurements in a voluminous LIDAR point dataset using a filter, because the DTM is created by interpolating ground points. As one of the most widely used filtering methods, the progressive morphological (PM) filter has the advantages of classifying the LIDAR data at the point level, a linear computational complexity, and preserving the geometric shapes of terrain features. The filter works well in an urban setting with a gentle slope and a mixture of vegetation and buildings. However, the PM filter often incorrectly removes ground measurements in areas of high topographic relief, along with large non-ground objects, because it uses a constant slope threshold, resulting in "cut-off" errors. A novel cluster analysis method was developed in this study and incorporated into the PM filter to prevent the removal of ground measurements at topographic highs. Furthermore, to obtain optimal filtering results for an area with undulating terrain, a trend analysis method was developed to adaptively estimate the slope-related thresholds of the PM filter based on changes in topographic slope and the characteristics of non-terrain objects. The comparison of the PM and generalized adaptive PM (GAPM) filters for selected study areas indicates that the GAPM filter preserves most of the "cut-off" points incorrectly removed by the PM filter. The application of the GAPM filter to seven ISPRS benchmark datasets shows that the GAPM filter reduces the filtering error by 20% on average, compared with the method used by the popular commercial software TerraScan.
The combination of the cluster method, adaptive trend analysis, and the PM filter allows users without much experience in processing LIDAR data to effectively and efficiently identify ground measurements for complex terrains in a large LIDAR dataset. The GAPM filter is highly automatic and requires little human input. Therefore, it can significantly reduce the effort of manually processing voluminous LIDAR measurements.
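The core progressive morphological idea can be sketched in one dimension: open the surface (erosion then dilation) with growing windows, and flag points that rise above the opened surface by more than the window's elevation-difference threshold. The window sizes, thresholds, and 1-D setting below are illustrative assumptions; the actual PM and GAPM filters operate on 2-D point clouds with slope-adaptive thresholds.

```python
import numpy as np

def pm_filter_1d(z, windows=(1, 2, 4), dz_thresholds=(0.5, 1.0, 2.0)):
    """Simplified 1-D progressive morphological filter (illustrative only).
    Returns a boolean mask: True = ground, False = non-ground."""
    n = len(z)
    ground = np.ones(n, dtype=bool)
    surface = z.astype(float).copy()
    for w, dz_max in zip(windows, dz_thresholds):
        # morphological opening: erosion (min) followed by dilation (max)
        eroded = np.array([surface[max(0, i - w):i + w + 1].min() for i in range(n)])
        opened = np.array([eroded[max(0, i - w):i + w + 1].max() for i in range(n)])
        # points far above the opened surface are non-ground at this scale
        ground &= (surface - opened) <= dz_max
        surface = opened
    return ground

# flat terrain with a small "building" bump at indices 4-5
z = np.array([0.1, 0.2, 0.1, 0.2, 5.0, 5.1, 0.2, 0.1])
print(pm_filter_1d(z))  # bump points come out as non-ground
```

The growing windows mirror the PM filter's progression: small windows remove vegetation-scale objects, larger windows remove building-scale objects, while the per-window threshold protects genuine terrain relief.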
Abstract:
There is an increasing demand for DNA analysis because of the sensitivity of the method and the ability to uniquely identify and distinguish individuals with a high degree of certainty. But this demand has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples where the evidence contains more than one contributor. Additional processing to separate different cell types in order to simplify the final data interpretation further contributes to the existing cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures.
Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also conducted to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile.
An easy-to-use interface, minimal manual intervention, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.
Abstract:
The objective of this study was to develop a GIS-based multi-class index overlay model to determine areas susceptible to inland flooding during extreme precipitation events in Broward County, Florida. Data layers used in the method include Airborne Laser Terrain Mapper (ALTM) elevation data, excess precipitation depth determined by performing a Soil Conservation Service (SCS) Curve Number (CN) analysis, and the slope of the terrain. The method includes a calibration procedure that uses "weights and scores" criteria obtained from Hurricane Irene (1999) records, a reported 100-year precipitation event, Doppler radar data, and documented flooding locations. Results are displayed in maps of eastern Broward County depicting types of flooding scenarios for a 100-year, 24-hour storm based on soil saturation conditions. As expected, the results of the multi-class index overlay analysis showed an increased potential for inland flooding when a higher antecedent moisture condition is experienced. The proposed method shows some potential as a predictive tool for flooding susceptibility based on a relatively simple approach.
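The weighted index overlay at the heart of such a model can be sketched in a few lines: each cell's susceptibility is the weight-by-score sum across the data layers. The weights and class scores below are hypothetical placeholders, not the calibrated values derived from the Hurricane Irene records.

```python
# hypothetical layer weights (must sum to 1.0) for the overlay
weights = {"elevation": 0.5, "excess_precip": 0.3, "slope": 0.2}

# per-cell class scores, 1 (low susceptibility) to 4 (high), on a tiny 2x2 grid
layers = {
    "elevation":     [[4, 3], [2, 1]],
    "excess_precip": [[4, 4], [2, 1]],
    "slope":         [[3, 2], [2, 1]],
}

rows, cols = 2, 2
susceptibility = [
    [sum(weights[k] * layers[k][i][j] for k in weights) for j in range((cols))]
    for i in range(rows)
]
print(susceptibility)  # higher weighted index = more flood-susceptible cell
```

Calibration in such a model amounts to adjusting the weights and class breakpoints until the high-index cells coincide with documented flooding locations.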