890 results for Context Model
Abstract:
The need for efficient, sustainable, and planned utilization of resources is ever more critical. In the U.S. alone, buildings consume 34.8 quadrillion (10^15) BTU of energy annually at a cost of $1.4 trillion. Of this energy, 58% is used for heating and air conditioning. Several building energy analysis tools have been developed to assess energy demands and lifecycle energy costs in buildings. Such analyses are also essential for an efficient HVAC design that overcomes the pitfalls of an under- or over-designed system. DOE-2 is among the most widely known full building energy analysis models. It also constitutes the simulation engine of other prominent software such as eQUEST, EnergyPro, and PowerDOE. It is therefore essential that DOE-2 energy simulations be highly accurate. Infiltration is an uncontrolled process through which outside air leaks into a building. Studies have estimated infiltration to account for up to 50% of a building's energy demand. Considered alongside the annual cost of building energy consumption, this reveals the costs of air infiltration and stresses the need for prominent building energy simulation engines to accurately account for its impact. In this research, the relative accuracy of current air infiltration calculation methods is evaluated against an intricate multiphysics hygrothermal CFD building envelope analysis. The full-scale CFD analysis is based on a meticulous representation of cracking in building envelopes and on real-life conditions. The research found that even the most advanced current infiltration methods, including those in DOE-2, exhibit up to 96.13% relative error versus the CFD analysis. An Enhanced Model for Combined Heat and Air Infiltration Simulation was developed. The model resulted in a 91.6% improvement in relative accuracy over current models. It reduces error versus the CFD analysis to less than 4.5% while requiring less than 1% of the time needed for such a complex hygrothermal analysis. The algorithm used in our model was demonstrated to be easy to integrate into DOE-2 and other engines as a standalone method for evaluating infiltration heat loads. This will vastly increase the accuracy of such simulation engines while maintaining the speed and ease of use that make them so widely used in building design.
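The abstract does not define its error metrics, but a reading consistent with the quoted figures (a minimal sketch, assuming the comparison is made on predicted infiltration heat loads Q) is:

\[
\varepsilon_{\mathrm{rel}} = \frac{\lvert Q_{\mathrm{model}} - Q_{\mathrm{CFD}} \rvert}{\lvert Q_{\mathrm{CFD}} \rvert} \times 100\%,
\qquad
\varepsilon_{\mathrm{current}} - \varepsilon_{\mathrm{enhanced}} \approx 96.13\% - 4.5\% \approx 91.6\%,
\]

i.e., the reported 91.6% improvement appears to be the reduction in relative error against the CFD benchmark.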
Abstract:
Simarouba glauca, a non-edible oilseed crop native to South Florida, is gaining popularity as a feedstock for the production of biodiesel. The University of Agricultural Sciences in Bangalore, India has developed a biodiesel production model based on the principles of decentralization, small scale, and multiple fuel sources. The success of such a program depends on conversion efficiencies at multiple stages. The conversion efficiency of the field-level, decentralized production model was compared with the in-laboratory conversion efficiency benchmark. The study indicated that the field-level model's conversion efficiency was lower than that of the lab-scale setup. The fuel qualities and characteristics of the Simarouba glauca biodiesel were tested and found to meet the standards required for fuel designation. However, this research suggests that for Simarouba glauca to be widely accepted as a biodiesel feedstock, further investigation is still required.
Abstract:
Personality has long been linked to performance. Evolutions in this relationship have brought forward new questions regarding the true nature of how personality impacts performance. Both direct and indirect relationships have been shown to be significant. This study further investigated potential indirect relationships by including a mediating variable, mental model formation, in the personality-performance relationship. Undergraduate students were assessed in a 6-week, Time 1 to Time 2 experiment. Conceptualizations of personality included measures of the Big 5 model and self-efficacy, with performance measured by content quiz and overall course scores. Findings showed that the Big 5 personality traits of extraversion and agreeableness positively and significantly impacted commonality with the instructor's mental model. However, commonality with the instructor's mental model did not impact performance. In comparison, commonality with an expert mental model positively and significantly impacted performance for both the content quiz and the overall course score. Furthermore, similarity with an expert mental model positively and significantly impacted overall course performance. The hypothesized full mediation of mental model formation in the personality-performance relationship was not supported due to a lack of the direct effect relationships required for mediation. However, a revised conceptualization of the results emerged. Findings from the current study point to the novel and unique role mental models play in the personality-performance relationship. While personality traits do impact mental model formation, accuracy in the mental models formed is critical to performance.
Abstract:
Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete in drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage still needs to be considered to determine the location of the boundary between the concrete and the soil, and thus the diameter of the drilled shaft. It is also economical to devise a methodology that obtains the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system. Such a methodology can be carried out with the CSL equipment, following the CSL tests used to check the integrity of the inside concrete, thus allowing the drilled shaft diameter to be determined without having to set up another NDT device. The proposed new method is based on installing galvanized tubes outside the shaft, across from each inside tube, and performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of the concrete and the ultrasonic wave properties using signal processing. The experimental results show a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from the frequency-domain data. This study demonstrates how this new NDT method for measuring the diameter of drilled shafts during construction overcomes the limitations of currently used methods. In another part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects and is called Frequency Tomography Analysis (FTA). Time-domain data are transferred to frequency-domain data for the signals propagated between tubes using the Fast Fourier Transform (FFT). Then the distribution of the FTA is evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth location of any void or defect along the length of the drilled shaft for voids inside the cage. The last part of the study also evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
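As an illustration of the FFT step described above (a minimal sketch, not the dissertation's code; the sampling rate, windowing, and synthetic pulse below are assumptions), the maximum frequency-domain amplitude of a received CSL trace can be extracted as follows:

```python
import numpy as np

def csl_peak_frequency_amplitude(signal, sample_rate_hz):
    """Illustrative sketch: transform a received CSL time-domain trace to the
    frequency domain with an FFT and return the peak spectral amplitude, the
    quantity the abstract correlates with concrete thickness outside the cage."""
    spectrum = np.fft.rfft(signal * np.hanning(len(signal)))   # windowed FFT
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    amplitudes = np.abs(spectrum)
    peak = amplitudes.argmax()
    return freqs[peak], amplitudes[peak]

# Example with a synthetic 50 kHz pulse sampled at 1 MHz (hypothetical values)
t = np.arange(0, 1e-3, 1e-6)
trace = np.exp(-((t - 2e-4) ** 2) / (2 * (2e-5) ** 2)) * np.sin(2 * np.pi * 5e4 * t)
print(csl_peak_frequency_amplitude(trace, 1e6))
```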
Abstract:
With evidence of increasing hurricane risks in the Georgia Coastal Area (GCA) and Virginia in the U.S. Southeast, and elsewhere, understanding intended evacuation behavior is becoming more and more important for community planners. My research investigates intended evacuation behavior in response to hurricane risks, using a behavioral survey of six counties in the GCA conducted under the direction of two social scientists with extensive experience in survey research related to citizen and household response to emergencies and disasters. Respondents indicated whether they would evacuate under both voluntary and mandatory evacuation orders. Bivariate probit models are used to investigate the subjective belief structure of whether or not respondents are concerned about the hurricane, and the intended probability of evacuating, as a function of risk perception and a range of demographic and socioeconomic variables (e.g., gender, military status, age, length of residence, and vehicle ownership).
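For reference, a generic bivariate probit specification of the kind used here (textbook form; the actual covariates and coding are the survey's own) jointly models concern about the hurricane (y_1) and intended evacuation (y_2) with correlated latent errors:

\[
y_1^* = x_1'\beta_1 + \varepsilon_1, \quad
y_2^* = x_2'\beta_2 + \varepsilon_2, \quad
(\varepsilon_1, \varepsilon_2) \sim N\!\left(0, \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}\right),
\]

with \(y_j = \mathbf{1}[y_j^* > 0]\), so that, for example, \(\Pr(y_1 = 1, y_2 = 1) = \Phi_2(x_1'\beta_1,\, x_2'\beta_2;\, \rho)\).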
Abstract:
Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative, and geospatial variables that differ in terms of scale, weight, and type. Though many of these variables are recognized by specialists in security studies, there remains controversy with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically weighted regression analysis produced the most accurate result to accommodate non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism. This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality of life.
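For reference, the geographically weighted regression estimator underlying that result takes the standard textbook form (not the dissertation's specific specification):

\[
\hat{\beta}(u_i, v_i) = \left(X^\top W(u_i, v_i)\, X\right)^{-1} X^\top W(u_i, v_i)\, y,
\]

where \(W(u_i, v_i)\) is a diagonal matrix of spatial kernel weights (e.g., \(w_{ij} = \exp(-d_{ij}^2 / b^2)\) for bandwidth \(b\)), so the estimated coefficients vary with location \((u_i, v_i)\) rather than being held stationary across the study area.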
Abstract:
Recent studies on the economic status of women in Miami-Dade County (MDC) reveal an alarming rate of economic insecurity and significant obstacles for women to achieve economic security. Consistent barriers to women’s economic security affect not only the health and wellbeing of women and their families, but also economic prospects for the community. A key study reveals that in Miami-Dade County, “Thirty-nine percent of single female-headed families with at least one child are living at or below the federal poverty level” and “over half of working women do not earn adequate income to cover their basic necessities” (Brion 2009, 1). Moreover, conventional measures of poverty do not adequately capture women’s struggles to support themselves and their families, nor do they document the numbers of women seeking basic self-sufficiency. Beyond the critical problem of a lack of accurate data on women in the county, there is also a dearth of social science research on existing efforts to enhance women’s economic security in Miami-Dade County. My research contributes to closing the information gap by examining the characteristics and strategies of women-led community development organizations (CDOs) in MDC working to address women’s economic insecurity. The research is informed by a framework developed by Marilyn Gittell, who pioneered an approach to studying women-led CDOs in the United States. On the basis of research in nine U.S. cities, she concluded that women-led groups increased community participation and “by creating community networks and civic action, they represent a model for community development efforts” (Gittell, et al. 2000, 123). My study documents the strategies and networks of women-led CDOs in MDC that prioritize women’s economic security. Their strategies are especially important during these times of economic recession and government reductions in funding for social services. The focus of the research is women-led CDOs that work to improve social services access, economic opportunity, civic participation and capacity, and women’s rights. Although many women-led CDOs prioritize building social infrastructures that promote change, inequalities in economic and political status for women without economic security remain a challenge (Young 2004). My research supports previous studies by Gittell, et al., finding that women-led CDOs in Miami-Dade County have key characteristics of a model of community development efforts that use networking and collaboration to strengthen their broad, integrated approach. The resulting community partnerships, coupled with participation by constituents in the development process, build a foundation to influence policy decisions for social change. In addition, my findings show that women-led CDOs in Miami-Dade County have a major focus on alleviating poverty and economic insecurity, particularly that of women. Finally, it was found that a majority of the five organizations network transnationally, using lessons learned to inform their work of expanding the agency of their constituents and placing the economic empowerment of women as central to the process of family and community development.
Abstract:
This research is part of continued efforts to correlate the hydrology of East Fork Poplar Creek (EFPC) and Bear Creek (BC) with the long-term distribution of mercury within the overland, subsurface, and river sub-domains. The main objective of this study was to add a sedimentation module (ECO Lab) capable of simulating the reactive transport and mercury exchange mechanisms within sediments and porewater throughout the watershed. The enhanced model was then applied to a Total Maximum Daily Load (TMDL) mercury analysis for EFPC. That application used historical precipitation, groundwater levels, river discharges, and mercury concentration data retrieved from government databases and input to the model. The model was executed to reduce computational time and to predict flow discharges, total mercury concentration, flow duration, and mercury mass rate curves at key monitoring stations under various hydrological and environmental conditions and scenarios. The computational results provided insight into the relationship between discharges and mercury mass rate curves at various stations throughout EFPC, which is important for understanding and supporting the management of mercury contamination and remediation efforts within EFPC.
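As a point of reference (an assumed definition, not stated in the abstract), a mercury mass rate curve at a monitoring station is commonly computed as the product of concentration and discharge over time:

\[
\dot{m}_{\mathrm{Hg}}(t) = C_{\mathrm{Hg}}(t)\, Q(t),
\]

where \(C_{\mathrm{Hg}}\) is the total mercury concentration and \(Q\) is the flow discharge (in consistent units), so the curve tracks how much mercury mass passes the station per unit time.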
Abstract:
In the presented thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The thesis work involved the development and implementation of numerical algorithms, data structures, and software. Numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated. Convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, as both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows for exact satisfaction of boundary conditions, making it possible to exactly capture the effects of the fluid field on the solid structure.
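For context, the fluid phase in such a coupling is governed by the standard lattice Boltzmann BGK update (textbook form; the specifics of the coupling with the meshfree method are not reproduced here):

\[
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\ t + \Delta t) = f_i(\mathbf{x}, t) - \frac{\Delta t}{\tau}\left[f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t)\right],
\]

where the \(f_i\) are particle distribution functions along lattice directions \(\mathbf{c}_i\), \(\tau\) is the relaxation time, and the macroscopic fields are recovered as moments \(\rho = \sum_i f_i\) and \(\rho\mathbf{u} = \sum_i f_i \mathbf{c}_i\).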
Abstract:
Ensemble stream modeling and data cleaning are sensor information processing systems that have different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble that has the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as a bush or natural forest fire, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the F-test is applied at each node's split to select the best attribute. The ensemble stream model approach proved to improve when complicated features were used with a simpler tree classifier. The ensemble framework for data cleaning, and the enhancements to quantify the quality of fitness of sensor data (30% spatial, 10% temporal, and 90% mobility reduction), led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
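For reference, the standard definitions behind the F-measure used above (stated here with precision and recall) are:

\[
P = \frac{TP}{TP + FP}, \qquad R = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2PR}{P + R},
\]

where TP, FP, and FN are the counts of true positive, false positive, and false negative fire-event detections.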
Abstract:
The purpose of this thesis was to examine the mediating effects of job-related negative emotions on the relationship between workplace aggression and outcomes. Additionally, the moderating effects of workplace social support and intensity of workplace aggression are considered. A total of 321 working individuals participated through an online survey. The results of this thesis suggest that job-related negative emotions are a mediator of the relationship between workplace aggression and outcomes, with full and partial mediation supported. Workplace social support was found to be a buffering variable in the relationship between workplace aggression and outcomes, regardless of the source of aggression (supervisor or co-worker) or the source of the social support. Finally, intensity of aggression was found to be a strong moderator of the relationship between workplace aggression and outcomes.
Abstract:
Matrix factorization (MF) has evolved as one of the better practices for handling sparse data in the field of recommender systems. Funk singular value decomposition (SVD) is a variant of MF that became a state-of-the-art method during the Netflix Prize competition. The method is widely used, with modifications, in present-day recommender systems research. With the potential of data points to grow at very high velocity, it is prudent to devise newer methods that can handle such data more accurately and efficiently than Funk-SVD in the context of recommender systems. In view of the growing data, I propose a latent factor model that caters to both accuracy and efficiency by reducing the number of latent features of either users or items, making it less complex than Funk-SVD, where the numbers of latent features of users and items are equal and often larger. A comprehensive empirical evaluation of accuracy on two publicly available datasets, Amazon and ml-100k, reveals the comparable accuracy and lower complexity of the proposed methods relative to Funk-SVD.
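To make the Funk-SVD baseline concrete, the following is a minimal stochastic-gradient sketch of the classic formulation with equal numbers of user and item latent features (the proposed reduced-feature variant is not reproduced, and all names and hyperparameters below are illustrative):

```python
import numpy as np

def funk_svd(ratings, n_users, n_items, k=20, lr=0.005, reg=0.02, epochs=20):
    """Minimal Funk-SVD sketch: SGD over observed (user, item, rating) triples.

    `ratings` is an iterable of (u, i, r) tuples with 0-based indices; `k` is
    the number of latent features shared by users and items.
    """
    rng = np.random.default_rng(0)
    P = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
    Q = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
    for _ in range(epochs):
        for u, i, r in ratings:
            pu = P[u].copy()
            err = r - pu @ Q[i]                   # prediction error
            P[u] += lr * (err * Q[i] - reg * pu)  # regularized SGD updates
            Q[i] += lr * (err * pu - reg * Q[i])
    return P, Q

# Tiny illustrative usage (hypothetical data, not the Amazon or ml-100k sets)
data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0)]
P, Q = funk_svd(data, n_users=3, n_items=2, k=4)
print(P @ Q.T)  # predicted rating matrix
```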
Abstract:
Background Context: Percutaneous vertebroplasty (PVP) is a minimally invasive surgical procedure frequently performed in humans who need surgical treatment of vertebral fractures. PVP involves cement injection into the vertebral body, thereby providing rapid and significant pain relief. Purpose: The testing of novel biomaterials depends on suitable animal models. The aim of this study was to develop a reproducible and safe model of PVP in sheep. Study Design: An ex vivo and in vivo large animal model study (Merino sheep). Methods: Ex vivo vertebroplasty was performed through a bilateral modified parapedicular access in 24 ovine lumbar hemivertebrae, divided into four groups (n=6). Cerament (Bone Support, Lund, Sweden) was the control material. In the experimental group, a novel composite was tested, Spine-Ghost, which consisted of an alpha-calcium sulfate matrix enriched with micrometric particles of mesoporous bioactive glass. All vertebrae were assessed by micro-computed tomography (micro-CT) and underwent mechanical testing. For the in vivo study, 16 sheep were randomly allocated into control and experimental groups (n=8) and underwent PVP using the same bone cements. All vertebrae were assessed postmortem by micro-CT, histology, and reverse transcription-polymerase chain reaction (RT-PCR). This work has been supported by the European Commission under the 7th Framework Programme for collaborative projects (600,000–650,000 USD). Results: In the ex vivo model, the average defect volume was 1,275.46±219.29 mm3. Adequate defect filling with cement was observed. No mechanical failure was observed under loads higher than physiological. In the in vivo study, cardiorespiratory distress was observed in two animals, and one sheep presented mild neurologic deficits in the hind limbs before recovering. Conclusions: The model of PVP is considered suitable for preclinical in vivo studies, mimicking clinical application. All sheep recovered and completed a 6-month implantation period. There was no evidence of cement leakage into the vertebral foramen in the postmortem examination.
Abstract:
In an organisation, any process of optimizing its operations faces increasing challenges and requires new approaches to the organizational phenomenon. Indeed, this work addresses the problem of efficiency dynamics through intangible variables that may support a different view of corporations. It focuses on the challenges that information management and the incorporation of context bring to competitiveness. Thus, this work presents the analysis and development of an intelligent decision support system in the form of a formal agenda built on a Logic Programming based methodology for problem solving, complemented with a computational approach grounded on Artificial Neural Networks. The proposed model is itself fairly precise, with overall accuracy, sensitivity, and specificity values higher than 90%. The proposed solution is indeed unique, catering for the explicit treatment of incomplete, unknown, or even self-contradictory information, in either a quantitative or qualitative form.
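The reported figures correspond to the standard confusion-matrix metrics (definitions added for clarity; thresholds and data are the study's own):

\[
\text{accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\text{sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{specificity} = \frac{TN}{TN + FP}.
\]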
Abstract:
Atrial fibrillation is associated with a five-fold increase in the risk of cerebrovascular events, being responsible for 15-18% of all strokes. The morphological and functional remodelling of the left atrium caused by atrial fibrillation favours blood stasis and, consequently, stroke risk. In this context, several clinical studies suggest that stroke risk stratification could be improved by using haemodynamic information on the left atrium (LA) and the left atrial appendage (LAA). The goal of this study was to develop a personalized computational fluid-dynamics (CFD) model of the left atrium that could clarify the haemodynamic implications of atrial fibrillation on a patient-specific basis. The developed CFD model was first applied to better understand the role of the LAA in stroke risk. In fact, the interplay of LAA geometric parameters such as length, tortuosity, surface area, and volume with the fluid-dynamics parameters, and the effects of LAA closure, had not been investigated. Results demonstrated the capability of the CFD model to reproduce the real physiological behaviour of blood flow dynamics inside the LA and the LAA. Finally, we determined that the fluid-dynamics parameters examined in this research project could be used as new quantitative indexes to describe the different types of AF and open new scenarios for patient-specific stroke risk stratification.