929 results for Based structure model
Abstract:
With its low power consumption and flexible networking capabilities, IEEE 802.15.4 has been widely regarded as a strong candidate communication technology for wireless sensor networks (WSNs). It is expected that, with an increasing number of deployments of 802.15.4-based WSNs, multiple WSNs could coexist with full or partial overlap in residential or enterprise areas. As WSNs are usually deployed without coordination, communication can suffer significant degradation under the 802.15.4 channel access scheme, which has a large impact on system performance. In this thesis we investigate the effectiveness of 802.15.4 networks supporting WSN applications in various environments, especially when hidden terminals are present due to the uncoordinated coexistence problem. Both analytical models and system-level simulators are developed to analyse the performance of the random access scheme specified by the IEEE 802.15.4 medium access control (MAC) standard for several network scenarios. The first part of the thesis investigates the effectiveness of a single 802.15.4 network supporting WSN applications. A Markov chain based analytical model is applied to model the MAC behaviour of the IEEE 802.15.4 standard, and a discrete event simulator is also developed to analyse the performance and verify the proposed analytical model. It is observed that 802.15.4 networks can sufficiently support most WSN applications with their various functionalities. After the investigation of the single network, the uncoordinated coexistence problem of multiple 802.15.4 networks deployed with fully or partially overlapping communication ranges is investigated in the next part of the thesis. Both non-sleep and sleep modes are investigated under different channel conditions by analytical and simulation methods to obtain a comprehensive performance evaluation. It is found that the uncoordinated coexistence problem can significantly degrade the performance of 802.15.4 networks, which is then unlikely to satisfy the QoS requirements of many WSN applications. The proposed analytical model is validated by simulations and can be used to obtain optimal parameter settings before WSN deployment to mitigate interference risks.
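A minimal slot-level Monte Carlo sketch of saturated contention under a simplified 802.15.4-style binary exponential backoff is shown below. It is not the thesis's Markov chain model or its discrete event simulator: the single clear-channel assessment, absence of acknowledgements, ideal channel, and one-slot transmissions are illustrative simplifications, and all parameter values are assumptions.

```python
import random

def simulate_csma(num_nodes=10, slots=200_000, macMinBE=3, macMaxBE=5):
    """Simplified slotted CSMA/CA: saturated nodes, single CCA, no ACKs, one-slot packets."""
    backoff = [random.randint(0, 2**macMinBE - 1) for _ in range(num_nodes)]
    be = [macMinBE] * num_nodes
    success = collision = 0
    for _ in range(slots):
        ready = [i for i in range(num_nodes) if backoff[i] == 0]
        if len(ready) == 1:
            success += 1
        elif len(ready) > 1:
            collision += 1
        for i in range(num_nodes):
            if backoff[i] == 0:
                # after an attempt, draw a fresh backoff; widen the window after a collision
                be[i] = macMinBE if len(ready) == 1 else min(be[i] + 1, macMaxBE)
                backoff[i] = random.randint(0, 2**be[i] - 1)
            else:
                backoff[i] -= 1
    return success / slots, collision / slots

if __name__ == "__main__":
    s, c = simulate_csma()
    print(f"successful-transmission slots: {s:.3f}, collision slots: {c:.3f}")
```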
Abstract:
The IEEE 802.15.4 standard is a relatively new standard designed for low-power, low-data-rate wireless sensor networks (WSNs), which have a wide range of applications, e.g., environment monitoring, e-health, and home and industry automation. In this paper, we investigate the problem of hidden devices in coverage-overlapped IEEE 802.15.4 WSNs, which is likely to arise when multiple 802.15.4 WSNs are deployed closely and independently. We consider a typical scenario of two 802.15.4 WSNs with partial coverage overlap and propose a Markov-chain based analytical model to reveal the performance degradation due to the hidden devices introduced by the coverage overlap. The impacts of hidden devices and network sleeping modes on saturated throughput and energy consumption are modeled. The analytical model is verified by simulations and can provide insights for network design and planning when multiple 802.15.4 WSNs are deployed closely. © 2013 IEEE.
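As an illustrative back-of-envelope check, not the paper's Markov-chain model, the classic vulnerable-window argument gives the collision probability seen by a sender when a hidden device transmits as a Poisson source; the packet length and offered rate below are hypothetical.

```python
import math

def hidden_collision_prob(rate_pkts_per_s, packet_time_s):
    """P(collision) for a hidden transmitter modelled as a Poisson source:
    any hidden-node start within the 2*T vulnerable window destroys the packet."""
    return 1.0 - math.exp(-2.0 * rate_pkts_per_s * packet_time_s)

# Example: 4.256 ms packet (133 bytes at 250 kb/s), hidden node offering 20 pkt/s.
print(hidden_collision_prob(20.0, 0.004256))  # roughly 0.16
```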
Abstract:
Timely warning of the public during large-scale emergencies is essential to ensure safety and save lives. This ongoing study proposes an agent-based simulation model to simulate warning message dissemination among the public considering both official and unofficial channels. The proposed model was developed in NetLogo software for a hypothetical area and requires input parameters such as the effectiveness of each official source (%), the estimated time to begin informing others, the estimated time to inform others, and the estimated percentage of people who do not relay the message. This paper demonstrates a means of factoring the behaviour of the public as informants into estimating the effectiveness of warning dissemination during large-scale emergencies. The model provides a tool for practitioners to test the potential impact of informal channels on the overall warning time and the sensitivity of the modelling parameters. The tool would help practitioners persuade evacuees to disseminate the warning message by informing others, similar to the 'Run to thy neighbour' campaign conducted by the Red Cross.
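The NetLogo model itself is not reproduced here; the following Python sketch only illustrates the idea of combining official seeding with informal relaying, using the same kinds of inputs (official reach, time to begin informing others, time to inform others, fraction who do not relay). The population size, contact rule, and all parameter values are hypothetical assumptions.

```python
import random

def simulate_warning(pop=1000, official_reach=0.4, non_relay_frac=0.3,
                     delay_to_start=2, time_to_inform=1, contacts_per_step=2,
                     steps=60, seed=1):
    random.seed(seed)
    # Agents reached directly by official channels at time 0.
    informed_at = {i: 0 for i in random.sample(range(pop), int(official_reach * pop))}
    # Agents willing to relay the message informally.
    relays = {i for i in range(pop) if random.random() > non_relay_frac}
    coverage = []
    for t in range(1, steps + 1):
        newly = {}
        for agent, t0 in informed_at.items():
            started = t >= t0 + delay_to_start
            if agent in relays and started and (t - t0 - delay_to_start) % time_to_inform == 0:
                for target in random.sample(range(pop), contacts_per_step):
                    if target not in informed_at:
                        newly[target] = t
        informed_at.update(newly)
        coverage.append(len(informed_at) / pop)
    return coverage

cov = simulate_warning()
print(f"warned after 10 steps: {cov[9]:.0%}, after 60 steps: {cov[-1]:.0%}")
```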
Abstract:
This article reports on a conversation between 12 lesbian, gay, bisexual and transgender (LGBT) psychologists at the first international LGBT Psychology Summer Institute at the University of Michigan in August 2009. Participants discuss how their work in LGBT psychology is affected by national policy, funding and academic contexts and the transnational influence of the US-based stigma model of LGBT psychology. The challenges and possibilities posed by internationalism are discussed with reference to the dominance of the United States, the cultural limits of terms such as 'lesbian, gay, bisexual and transgender', intergenerational communication between researchers and the role of events such as the Summer Institute in creating an international community of LGBT psychologists. © 2010 Taylor & Francis.
Abstract:
The value of knowing about data availability and system accessibility is analyzed through theoretical models of Information Economics. When a user places an inquiry for information, it is important for the user to learn whether the system is not accessible or the data is not available, rather than receiving no response at all. In reality, the system can provide various outcomes: nothing is displayed to the user (e.g., a traffic light that does not operate, a browser that keeps loading, a telephone that is not answered); random noise is displayed (e.g., a traffic light that displays random signals, a browser that returns disorderly results, an automatic voice message that does not clarify the situation); or a special signal indicates that the system is not operating (e.g., a blinking amber light indicating that the traffic light is down, a browser responding that the site is unavailable, a voice message regretting to tell that the service is not available). This article develops a model to assess the value of the information for the user in such situations by employing the information structure model prevailing in Information Economics. Examples related to data accessibility in centralized and distributed systems are provided for illustration.
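A small numerical sketch in the spirit of the information structure model is given below: a decision maker compares the expected payoff of acting on prior beliefs alone against acting on a signal that reveals whether the system and its data are available. The actions, payoffs, and probabilities are purely hypothetical and chosen only to make the value-of-information calculation concrete.

```python
# States of the world and prior probabilities (hypothetical numbers).
p_available = 0.7            # data is actually available
p_down = 1 - p_available     # system or data is unavailable

# Payoffs of two actions in each state (hypothetical utility units).
payoff = {
    ("query", "available"): 10, ("query", "down"): -6,      # querying a dead system wastes time
    ("fallback", "available"): 3, ("fallback", "down"): 3,   # fallback source: lower but safe value
}

def expected(action, p_avail):
    return p_avail * payoff[(action, "available")] + (1 - p_avail) * payoff[(action, "down")]

# Without any signal: commit to the single action with the best prior expectation.
ev_no_signal = max(expected(a, p_available) for a in ("query", "fallback"))

# With a perfect availability signal: choose the best action in each revealed state.
ev_signal = (p_available * max(payoff[(a, "available")] for a in ("query", "fallback"))
             + p_down * max(payoff[(a, "down")] for a in ("query", "fallback")))

print(f"EV without signal: {ev_no_signal:.2f}")
print(f"EV with signal:    {ev_signal:.2f}")
print(f"value of knowing availability: {ev_signal - ev_no_signal:.2f}")
```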
Abstract:
The paper presents a case study of geo-monitoring a region, consisting of capturing and encoding human expertise into a knowledge-based system. Once the maps have been processed, data patterns are detected using knowledge-based agents for harvest prognosis.
Abstract:
Classroom teachers are often required to implement new procedures or practices in response to local or federal education mandates. Attempts to implement innovations often do not take into account the personal side of change: the perceptions, concerns, and needs of those required to implement the innovation. One innovation that was required by the School Board of Broward County, Florida for all elementary classroom teachers was the implementation of Literacy Folders. This study attempted to address the personal side of change by identifying teacher concerns during the implementation of Literacy Folders in a select elementary school in Broward County, Florida. The Concerns-Based Adoption Model (CBAM) for change was used as the conceptual framework for this qualitative case study. Sources of data for this study included participant interviews, observations, and analysis of documents. Informal conversations with the participants and unscheduled classroom visits were also sources of data. Seven classroom teachers were interviewed using a predesigned interview guide developed based on the CBAM of change, specifically the Stages of Concern dimension. Participant responses were coded into two categories: (a) recollections of past perceptions, and (b) present perceptions regarding the innovation. Data analysis resulted in the emergence of one major theme and two subordinate themes. The themes were related to time and the purpose of the innovation. The researcher also discovered that the participants exhibited responses typically representative of the CBAM for individuals who are in the process of adjusting to a new innovation. Recommendations based on participant concerns are made for improving the implementation of the innovation. Recommendations for alternatives to the innovation and suggestions regarding areas for further research in the field of educational change are also made.
Abstract:
The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this research study tests the effectiveness of a predictor model. This model is based upon the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk of NCLEX-RN failure can be identified and remediated prior to taking the actual licensure examination. To date, no theory-based predictor model has been identified that predicts success on the NCLEX-RN. The model was tested using prerequisite course grades, nursing course grades, and scores on standardized examinations for the 2003 associate degree nursing graduates at an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power. Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were found to be significant to NCLEX-RN success, the overall model was found to be most predictive at the conclusion of the academic program of study. The inclusion of the RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
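A minimal sketch of fitting and checking such a predictor with logistic regression is shown below. The feature columns mirror the kinds of variables described (prerequisite grades, nursing course grades, a standardized RN Assessment score), but the data are synthetic and the scikit-learn pipeline is an illustrative assumption, not the study's actual dataset or procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 235  # cohort size mentioned in the abstract; the data below are synthetic

# Synthetic stand-ins: prerequisite GPA, nursing-course GPA, RN Assessment exam score.
X = np.column_stack([
    rng.normal(3.0, 0.4, n),    # prerequisite course grades
    rng.normal(3.1, 0.35, n),   # nursing course grades
    rng.normal(70, 8, n),       # standardized RN Assessment score
])
# Synthetic pass/fail label loosely driven by the exam score and GPAs.
logit = -20 + 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.15 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000)
acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"cross-validated prediction accuracy: {acc.mean():.2%}")
model.fit(X, y)
print("coefficients (prereq GPA, nursing GPA, exam score):", model.coef_.round(3))
```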
Abstract:
In recent years, wireless communication infrastructures have been widely deployed for both personal and business applications. IEEE 802.11 series Wireless Local Area Network (WLAN) standards attract a lot of attention due to their low cost and high data rate. Wireless ad hoc networks that use IEEE 802.11 standards are one of the hot spots of recent network research. Designing appropriate Media Access Control (MAC) layer protocols is one of the key issues for wireless ad hoc networks. Existing wireless applications typically use omni-directional antennas. When using an omni-directional antenna, the gain of the antenna is the same in all directions. Due to the nature of the Distributed Coordination Function (DCF) mechanism of IEEE 802.11 standards, only one of the one-hop neighbors can send data at a time. Nodes other than the sender and the receiver must be either in the idle or listening state, otherwise collisions could occur. The downside of the omni-directionality of antennas is that the spatial reuse ratio is low and the capacity of the network is considerably limited. Directional antennas have therefore been introduced to improve spatial reuse. A directional antenna has the following benefits. It can improve transport capacity by decreasing the interference of a directional main lobe. It can increase coverage range due to a higher SINR (Signal to Interference plus Noise Ratio), i.e., with the same power consumption, better connectivity can be achieved. And power usage can be reduced, i.e., for the same coverage, a transmitter can reduce its power consumption. To utilize the advantages of directional antennas, we propose a relay-enabled MAC protocol. Two relay nodes are chosen to forward data when the channel condition of the direct link from the sender to the receiver is poor. The two relay nodes can transfer data at the same time, and pipelined data transmission can be achieved by using directional antennas. Throughput can be improved significantly by introducing the relay-enabled MAC protocol. Besides these strong points, directional antennas also have some explicit drawbacks, such as the hidden terminal and deafness problems and the requirement of retaining location information for each node. Therefore, an omni-directional antenna should be used in some situations. The combined use of omni-directional and directional antennas leads to the problem of configuring heterogeneous antennas, i.e., given a network topology and a traffic pattern, we need to find a tradeoff between using omni-directional and directional antennas to obtain better network performance with this configuration. Directly and mathematically establishing the relationship between network performance and antenna configurations is extremely difficult, if not intractable. Therefore, in this research, we propose several clustering-based methods to obtain approximate solutions for the heterogeneous antenna configuration problem, which can improve network performance significantly. Our proposed methods consist of two steps. The first step (i.e., clustering links) is to cluster the links into different groups based on a matrix-based system model. After being clustered, the links in the same group have similar neighborhood nodes and will use the same type of antenna. The second step (i.e., labeling links) is to decide the type of antenna for each group. For heterogeneous antennas, some groups of links will use directional antennas and others will adopt omni-directional antennas. Experiments are conducted to compare the proposed methods with existing methods. Experimental results demonstrate that our clustering-based methods can improve network performance significantly.
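A minimal sketch of the two-step clustering-then-labeling idea follows: each link is described by the combined neighborhood of its endpoints, links with similar neighborhoods are grouped (k-means here), and each group is labeled with an antenna type by a simple density rule. The matrix construction, the clustering algorithm, the labeling rule, and the toy topology are illustrative assumptions rather than the dissertation's exact method.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_links(adjacency, links, n_groups=2, density_threshold=0.5, seed=0):
    """Step 1: describe each link by the union of its endpoints' neighborhoods.
    Step 2: cluster these descriptions and label dense groups as directional."""
    adjacency = np.asarray(adjacency)
    features = np.array([np.maximum(adjacency[u], adjacency[v]) for u, v in links])
    labels = KMeans(n_clusters=n_groups, n_init=10, random_state=seed).fit_predict(features)
    antenna = {}
    for g in range(n_groups):
        # many shared neighbors -> more contention -> prefer directional antennas
        density = features[labels == g].mean()
        antenna[g] = "directional" if density > density_threshold else "omni"
    return [(link, antenna[g]) for link, g in zip(links, labels)]

# Tiny 5-node example topology (symmetric adjacency matrix) with four links.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 1, 0]])
links = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(cluster_links(A, links))
```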
Abstract:
This dissertation aims to improve the performance of existing assignment-based dynamic origin-destination (O-D) matrix estimation models to successfully apply Intelligent Transportation Systems (ITS) strategies for the purposes of traffic congestion relief and dynamic traffic assignment (DTA) in transportation network modeling. The methodology framework has two advantages over existing assignment-based dynamic O-D matrix estimation models. First, it combines an initial O-D estimation model into the estimation process to provide a high-confidence initial input for the dynamic O-D estimation model, which has the potential to improve the final estimation results and reduce the associated computation time. Second, the proposed methodology framework can automatically convert traffic volume deviation to traffic density deviation in the objective function under congested traffic conditions. Traffic density is a better indicator of traffic demand than traffic volume under congested conditions, so the conversion can help improve estimation performance. The proposed method shows better performance than a typical assignment-based estimation model (Zhou et al., 2003) in several case studies. In the case study for I-95 in Miami-Dade County, Florida, the proposed method produces a good result in seven iterations, with a root mean square percentage error (RMSPE) of 0.010 for traffic volume and an RMSPE of 0.283 for speed. In contrast, Zhou's model requires 50 iterations to obtain an RMSPE of 0.023 for volume and an RMSPE of 0.285 for speed. In the case study for Jacksonville, Florida, the proposed method reaches a convergent solution in 16 iterations with an RMSPE of 0.045 for volume and an RMSPE of 0.110 for speed, while Zhou's model needs 10 iterations to obtain its best solution, with an RMSPE of 0.168 for volume and an RMSPE of 0.179 for speed. The successful application of the proposed methodology framework to real road networks demonstrates its ability to provide results both with satisfactory accuracy and within a reasonable time, thus establishing its potential usefulness for supporting dynamic traffic assignment modeling, ITS, and other strategies.
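The error metric quoted above can be computed as follows; since RMSPE definitions vary slightly in the literature, the formulation below is an assumed one, and the observed and estimated link volumes are hypothetical.

```python
import numpy as np

def rmspe(observed, estimated):
    """Root mean square percentage error: sqrt(mean(((est - obs) / obs)^2))."""
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.sqrt(np.mean(((estimated - observed) / observed) ** 2))

# Hypothetical detector volumes (veh/h): observed counts vs. assignment-model estimates.
obs = [1200, 950, 1800, 640, 1500]
est = [1180, 985, 1765, 652, 1530]
print(f"RMSPE = {rmspe(obs, est):.3f}")
```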
Abstract:
An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. This scheme is derived from the conventional kernel estimator based prediction model by associating real-time nonlinear impacts caused by neighboring arcs' traffic patterns with the historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of Jacksonville City. Experiment results illustrate that the proposed scheme is able to significantly reduce both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain high-quality real-time traffic information, which is essential to the performance of the AMPRFP algorithm, a data clean scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This novel method investigates the correlation between distance and direction in the geometrical map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance. A direction filter is developed to clean joints that have a negative influence on the localization accuracy. Synthetic experiments in urban, suburban and rural environments are designed to evaluate the performance of the DCSEEL algorithm in determining the cellular probe's position. The results show that the cellular probe's localization accuracy can be notably improved by the DCSEEL algorithm. Additionally, a new fast correlation technique is developed to overcome the time efficiency problem of the existing correlation-algorithm-based floating car data (FCD) technique. The matching process is transformed into a one-dimensional (1-D) curve matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson product-moment correlation coefficient (PMCC) algorithm in order to achieve the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in reducing the computational cost without affecting the accuracy of the matching process.
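To illustrate the 1-D curve matching step, the sketch below uses a plain sliding-window normalized cross-correlation; the FNCC algorithm additionally precomputes running sums for speed, which is omitted here, and the signals are synthetic.

```python
import numpy as np

def normalized_cross_correlation(signal, template):
    """Pearson-style normalized cross-correlation of a template slid over a signal."""
    signal = np.asarray(signal, dtype=float)
    template = np.asarray(template, dtype=float)
    m = len(template)
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = []
    for start in range(len(signal) - m + 1):
        window = signal[start:start + m]
        w = window - window.mean()
        denom = np.sqrt((w ** 2).sum()) * t_norm
        scores.append(0.0 if denom == 0 else float((w * t).sum() / denom))
    return np.array(scores)

# Synthetic probe-speed curve and a template cut from it around offset 30.
signal = np.sin(np.linspace(0, 6 * np.pi, 200)) + 0.05 * np.random.default_rng(0).normal(size=200)
template = np.sin(np.linspace(0, 6 * np.pi, 200))[30:70]
scores = normalized_cross_correlation(signal, template)
print("best alignment offset:", int(scores.argmax()))  # expected near 30
```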
Abstract:
The major objectives of this dissertation were to develop optimal spatial techniques to model the spatial-temporal changes of the lake sediments and their nutrients from 1988 to 2006, and to evaluate the impacts of the hurricanes that occurred during 1998–2006. The mud zone shrank by about 10.5% from 1988 to 1998 and expanded by about 6.2% from 1998 to 2006. Mud areas, volumes and weights were calculated using validated kriging models. From 1988 to 1998, mud thicknesses increased by up to 26 cm in the central lake area. The mud area and volume decreased by about 13.78% and 10.26%, respectively. From 1998 to 2006, mud depths declined by up to 41 cm in the central lake area, and mud volume was reduced by about 27%. Mud weight increased by up to 29.32% from 1988 to 1998, but was reduced by over 20% from 1998 to 2006. The reduction of mud sediments is likely due to re-suspension and redistribution by waves and currents produced by large storm events, particularly Hurricanes Frances and Jeanne in 2004 and Wilma in 2005. Regression, kriging, geographically weighted regression (GWR) and regression-kriging models were calibrated and validated for the spatial analysis of the sediment TP and TN of the lake. GWR models provide the most accurate predictions for TP and TN based on model performance and error analysis. TP values declined from an average of 651 to 593 mg/kg from 1998 to 2006, especially in the lake's western and southern regions. From 1988 to 1998, TP declined in the northern and southern areas, and increased in the central-western part of the lake. TP weights increased by about 37.99%–43.68% from 1988 to 1998 and decreased by about 29.72%–34.42% from 1998 to 2006. From 1988 to 1998, TN decreased in most areas, especially in the northern and southern lake regions; the western littoral zone had the biggest increase, up to 40,000 mg/kg. From 1998 to 2006, TN declined from an average of 9,363 to 8,926 mg/kg, especially in the central and southern regions. The biggest increases occurred in the northern lake and southern edge areas. TN weights increased by about 15%–16.2% from 1988 to 1998, and decreased by about 7%–11% from 1998 to 2006.
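A minimal ordinary-kriging sketch of the kind of interpolation used for the mud and nutrient surfaces is shown below, with an exponential variogram whose parameters are arbitrary; the lake data, the fitted variograms, and the GWR models are not reproduced here.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng_param=15.0, nugget=0.0):
    """Exponential semivariogram model (parameters are illustrative)."""
    return nugget + sill * (1.0 - np.exp(-np.asarray(h) / rng_param))

def ordinary_kriging(xy_obs, z_obs, xy_new):
    """Solve the ordinary-kriging system for each prediction point."""
    xy_obs, z_obs, xy_new = map(np.asarray, (xy_obs, z_obs, xy_new))
    n = len(z_obs)
    d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d)
    A[n, n] = 0.0
    preds = []
    for p in xy_new:
        gamma0 = exp_variogram(np.linalg.norm(xy_obs - p, axis=1))
        w = np.linalg.solve(A, np.append(gamma0, 1.0))
        preds.append(float(w[:n] @ z_obs))  # w[:n] are kriging weights, w[n] the Lagrange multiplier
    return np.array(preds)

# Hypothetical coring stations (x, y in km) with mud thickness (cm), and two grid points.
stations = [(0, 0), (10, 2), (4, 8), (12, 9), (7, 4)]
thickness = [18.0, 26.0, 22.0, 30.0, 25.0]
print(ordinary_kriging(stations, thickness, [(6, 5), (11, 7)]))
```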
Abstract:
Shallow marine ecosystems are experiencing significant environmental alterations as a result of changing climate and increasing human activities along coasts. Intensive urbanization of the southeast Florida coast and intensification of climate change over the last few centuries changed the character of coastal ecosystems in the semi-enclosed Biscayne Bay, Florida. In order to develop management policies for the Bay, it is vital to obtain reliable scientific evidence of past ecological conditions. The long-term records of subfossil diatoms obtained from No Name Bank and Featherbed Bank in Central Biscayne Bay, and from the Card Sound Bank in the neighboring Card Sound, were used to study the magnitude of the environmental change caused by climate variability and water management over the last ~ 600 yr. Analyses of these records revealed that the major shifts in the diatom assemblage structures occurred at No Name Bank in 1956, at Featherbed Bank in 1966, and at Card Sound Bank in 1957. Smaller magnitude shifts were also recorded at Featherbed Bank in 1893, 1942, 1974 and 1983. Most of these changes coincided with severe drought periods that developed during the cold phases of the El Niño Southern Oscillation (ENSO), Atlantic Multidecadal Oscillation (AMO) and Pacific Decadal Oscillation (PDO), or when the AMO was in its warm phase and the PDO was in its cold phase. Only the 1983 change coincided with an unusually wet period that developed during the warm phases of ENSO and PDO. Quantitative reconstructions of salinity using the weighted averaging partial least squares (WA-PLS) diatom-based salinity model revealed a gradual increase in salinity at the three coring locations over the last ~ 600 yr, which was primarily caused by continuously rising sea level and, in the last several decades, also by the reduction of the amount of freshwater inflow from the mainland. Concentrations of sediment total nitrogen (TN), total phosphorus (TP) and total organic carbon (TOC) increased in the second half of the 20th century, which coincided with the construction of canals, landfills, marinas and water treatment plants along the western margin of Biscayne Bay. The increased magnitude and rate of diatom assemblage restructuring in the mid- and late 1900s suggest that large environmental changes are occurring more rapidly now than in the past.
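WA-PLS itself adds partial-least-squares components and deshrinking; the sketch below shows only the plain weighted-averaging step at its core: taxon optima are estimated as abundance-weighted mean salinities over a modern training set, and a fossil sample's salinity is reconstructed as the abundance-weighted mean of those optima. All abundances and salinities are hypothetical.

```python
import numpy as np

# Training set: rows = modern samples, columns = diatom taxa (relative abundances),
# with observed salinity at each sample site. All numbers are hypothetical.
train_abund = np.array([[0.6, 0.3, 0.1],
                        [0.2, 0.5, 0.3],
                        [0.1, 0.2, 0.7]])
train_salinity = np.array([20.0, 28.0, 35.0])

# Weighted-averaging optima: each taxon's abundance-weighted mean salinity.
optima = (train_abund * train_salinity[:, None]).sum(axis=0) / train_abund.sum(axis=0)

# Reconstruct salinity for a fossil sample as the abundance-weighted mean of optima.
fossil = np.array([0.3, 0.4, 0.3])
reconstruction = (fossil * optima).sum() / fossil.sum()
print("taxon optima:", optima.round(1), "| reconstructed salinity:", round(reconstruction, 1))
```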
Abstract:
High dependability, availability and fault tolerance are open problems in Service-Oriented Architecture (SOA). The possibility of generating software applications by integrating services from heterogeneous domains in a reliable way makes it worthwhile to face the challenges inherent to this paradigm. In order to ensure quality in service compositions, some research efforts propose the adoption of verification techniques to identify and correct errors. In this context, exception handling is a powerful mechanism for increasing SOA quality. Several research works are concerned with mechanisms for exception propagation in web services, implemented in many languages and frameworks. However, to the extent of our knowledge, no existing work evaluates these mechanisms in SOA with regard to the .NET framework. The main contribution of this paper is to evaluate and propose exception propagation mechanisms in SOA for applications developed within the .NET framework. In this direction, this work: (i) extends a previous study, showing the need to propose a solution for exception propagation in SOA for applications developed in .NET, and (ii) presents a solution, based on a model obtained from the results found in (i), that will be applied to real cases through fault injection and AOP techniques.
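The paper targets the .NET framework; purely as a language-agnostic illustration of what propagating an exception across a composed service means, the Python sketch below wraps each service boundary so that internal errors are translated into a structured fault that an upstream (composing) service can inspect, while faults raised by nested services pass through unchanged. All service and function names are hypothetical.

```python
import functools

class ServiceFault(Exception):
    """Structured fault carried across a service boundary."""
    def __init__(self, service, code, detail):
        super().__init__(f"{service}: {code}: {detail}")
        self.service, self.code, self.detail = service, code, detail

def propagates_faults(service_name):
    """Boundary decorator: translate internal errors into faults the caller can inspect,
    and let faults coming from downstream services pass through unchanged."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except ServiceFault:
                raise                       # already a fault from a nested service
            except Exception as exc:        # internal error: translate, do not swallow
                raise ServiceFault(service_name, type(exc).__name__, str(exc)) from exc
        return wrapper
    return decorator

@propagates_faults("InventoryService")
def reserve_item(item_id):
    raise KeyError(item_id)                 # injected fault for the experiment

@propagates_faults("OrderService")
def place_order(item_id):
    return reserve_item(item_id)            # composition: order service calls inventory service

try:
    place_order("sku-42")
except ServiceFault as fault:
    print(fault.service, fault.code, fault.detail)  # InventoryService KeyError 'sku-42'
```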