933 results for Intuitive Expertise


Relevance:

10.00%

Publisher:

Abstract:

As the commoditization of sensing, actuation and communication hardware increases, so does the potential for dynamically tasked sense and respond networked systems (i.e., Sensor Networks or SNs) to replace existing disjoint and inflexible special-purpose deployments (closed-circuit security video, anti-theft sensors, etc.). While various solutions have emerged to many individual SN-centric challenges (e.g., power management, communication protocols, role assignment), perhaps the largest remaining obstacle to widespread SN deployment is that those who wish to deploy, utilize, and maintain a programmable Sensor Network lack the programming and systems expertise to do so. The contributions of this thesis center on the design, development and deployment of the SN Workbench (snBench). snBench embodies an accessible, modular programming platform coupled with a flexible and extensible run-time system that, together, support the entire life-cycle of distributed sensory services. As it is impossible to find a one-size-fits-all programming interface, this work advocates the use of tiered layers of abstraction that enable a variety of high-level, domain-specific languages to be compiled to a common (thin-waist) tasking language; this common tasking language is statically verified and can be subsequently re-translated, if needed, for execution on a wide variety of hardware platforms. snBench provides: (1) a common sensory tasking language (Instruction Set Architecture) powerful enough to express complex SN services, yet simple enough to be executed by highly constrained resources with soft real-time constraints, (2) a prototype high-level language (and corresponding compiler) to illustrate the utility of the common tasking language and the tiered programming approach in this domain, (3) an execution environment and a run-time support infrastructure that abstract a collection of heterogeneous resources into a single virtual Sensor Network, tasked via this common tasking language, and (4) novel formal methods (i.e., static analysis techniques) that verify safety properties and infer implicit resource constraints to facilitate resource allocation for new services. This thesis presents these components in detail, as well as two specific case studies: the use of snBench to integrate physical and wireless network security, and the use of snBench as the foundation for semester-long student projects in a graduate-level Software Engineering course.
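
To make the tiered idea concrete, the sketch below expresses a purely hypothetical "on motion, grab a camera frame and notify" service as a small task graph, with a trivial static pass that infers which capabilities the service requires before deployment. It is an illustration of the approach only, not snBench's actual tasking language, compiler, or verifier; all names are made up.

```python
# Purely illustrative sketch (not snBench's tasking language or static analysis):
# a tiny sensory service as a task graph plus a trivial capability-inference pass.
from dataclasses import dataclass, field

@dataclass
class Task:
    op: str                                    # e.g. "sample", "filter", "notify"
    needs: set = field(default_factory=set)    # capabilities this step requires
    children: list = field(default_factory=list)

# "When motion is detected, capture a camera frame and notify the operator."
service = Task("sample", {"motion_sensor"}, [
    Task("filter", set(), [
        Task("sample", {"camera"}, [
            Task("notify", {"uplink"})
        ])
    ])
])

def required_capabilities(task):
    """Static pass: collect every capability the service will ever need."""
    caps = set(task.needs)
    for child in task.children:
        caps |= required_capabilities(child)
    return caps

print(required_capabilities(service))   # {'motion_sensor', 'camera', 'uplink'}
```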

Relevance:

10.00%

Publisher:

Abstract:

Localization is an essential feature for many mobile wireless applications. Data collected by applications such as environmental monitoring, package tracking or position tracking has no meaning without knowing where the data was collected. Other applications use location information as a building block, for example geographic routing protocols, data dissemination protocols and location-based services such as sensing coverage. Many existing techniques involve trade-offs among features such as the deployment of special hardware, the achievable accuracy and the required computation power. In this paper, we present an algorithm that extracts location constraints from connectivity information. Our solution, which requires no special hardware and only a small number of landmark nodes, uses two types of location constraints. The spatial constraints derive the estimated locations by observing which nodes are within communication range of each other. The temporal constraints refine the areas computed by the spatial constraints, using properties of time and space extracted from a contact trace. The intuition behind the temporal constraints is to limit the possible locations of a node using its previous and future locations. To quantify the improvement obtained by refining the nodes' estimated areas with temporal information, we performed simulations using synthetic and real contact traces. The results show this improvement and also the difficulties of using real traces.
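
A minimal sketch of how the two constraint types can shrink a node's candidate area, assuming a grid of candidate cells, a nominal radio range, and a maximum per-step displacement (illustrative values, not the paper's parameters; only the previous-location half of the temporal constraint is shown):

```python
# Illustrative sketch: refine a node's candidate area on a grid using spatial
# (connectivity) and temporal (max-speed) constraints. All values are examples.
import itertools

GRID = [(x, y) for x, y in itertools.product(range(20), range(20))]  # 20x20 cells
RADIO_RANGE = 3.0   # assumed communication range, in cells
MAX_SPEED = 2.0     # assumed max displacement per time step, in cells

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def spatial_constraint(area, landmarks_in_contact):
    """Keep only cells within radio range of every landmark the node heard."""
    return {c for c in area
            if all(dist(c, lm) <= RADIO_RANGE for lm in landmarks_in_contact)}

def temporal_constraint(area, previous_area):
    """Keep only cells reachable from some cell of the previous area at MAX_SPEED."""
    return {c for c in area
            if any(dist(c, p) <= MAX_SPEED for p in previous_area)}

# Example: the node heard landmarks at (5, 5) and (7, 6); one step earlier its
# area had already been narrowed to a few cells around (4, 5).
area = set(GRID)
area = spatial_constraint(area, [(5, 5), (7, 6)])
area = temporal_constraint(area, {(4, 5), (4, 6), (5, 5)})
print(len(area), "candidate cells remain")
```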

Relevance:

10.00%

Publisher:

Abstract:

A common assumption made in traffic matrix (TM) modeling and estimation is independence of a packet's network ingress and egress. We argue that in real IP networks, this assumption should not and does not hold. The fact that most traffic consists of two-way exchanges of packets means that traffic streams flowing in opposite directions at any point in the network are not independent. In this paper we propose a model for traffic matrices based on independence of connections rather than packets. We argue that the independent connection (IC) model is more intuitive, and has a more direct connection to underlying network phenomena than the gravity model. To validate the IC model, we show that it fits real data better than the gravity model and that it works well as a prior in the TM estimation problem. We study the model's parameters empirically and identify useful stability properties. This justifies the use of the simpler versions of the model for TM applications. To illustrate the utility of the model we focus on two such applications: synthetic TM generation and TM estimation. To the best of our knowledge this is the first traffic matrix model that incorporates properties of bidirectional traffic.
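
As a rough illustration of why connection-level modeling couples the two directions of a traffic matrix, the toy sketch below generates a synthetic TM from per-connection forward and reverse traffic and contrasts it with a packet-level gravity model fitted to the same totals. The functional form and every parameter are assumptions made for illustration; this is not the paper's IC model or its estimation procedure.

```python
# Toy sketch of connection-based TM generation vs. a packet-level gravity model.
import numpy as np

rng = np.random.default_rng(0)
n = 4                                        # number of PoPs (example value)
init_act = rng.uniform(1, 10, size=n)        # connection-initiation activity (assumed)
resp_act = rng.uniform(1, 10, size=n)        # responder popularity (assumed)

# Expected connections initiated at i with responder at j (gravity on connections).
conn = np.outer(init_act, resp_act) / resp_act.sum()

f_fwd, f_rev = 0.3, 0.7                      # assumed forward/reverse bytes per connection
tm_ic = f_fwd * conn + f_rev * conn.T        # traffic i -> j is coupled with j -> i

# Packet-level gravity model fitted to the same row/column totals: directions independent.
row, col = tm_ic.sum(axis=1), tm_ic.sum(axis=0)
tm_gravity = np.outer(row, col) / tm_ic.sum()

print(np.round(tm_ic, 2))
print(np.round(tm_gravity, 2))
```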

Relevance:

10.00%

Publisher:

Abstract:

In a road network, cyclists are the group exposed to the greatest risk. A cyclist's route choice is often based on level of expertise, perceived or actual road risks, personal decisions, weather conditions and a number of other factors. Consequently, cycling tends to be the only significant travel mode where optimised route choice is not based on shortest distance or least time. This paper presents an Android-based mobile app for personalised route planning of cyclists in Dublin. The mobile app, apart from its immediate advantage to cyclists, acts as the departure point for a number of research projects and aids in establishing some critical calibration values for the cycling network in Dublin.
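
As an illustration of route choice that is not purely least-time, the sketch below combines travel time with a rider-specific risk penalty in the edge cost before running a standard shortest-path search. The tiny network, the weights and the cost form are made-up examples, not the app's calibrated model.

```python
# Illustrative personalised route cost: travel time + risk penalty, then Dijkstra.
import heapq

def edge_cost(length_m, speed_mps, risk_score, risk_weight):
    """Generalised cost = travel time + rider-specific penalty for risky links."""
    return length_m / speed_mps + risk_weight * risk_score

# graph[node] = list of (neighbour, length_m, risk_score in [0, 1])
graph = {
    "A": [("B", 400, 0.8), ("C", 700, 0.1)],
    "B": [("D", 300, 0.9)],
    "C": [("D", 500, 0.2)],
    "D": [],
}

def best_route(graph, src, dst, speed_mps=5.0, risk_weight=120.0):
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, length, risk in graph[u]:
            nd = d + edge_cost(length, speed_mps, risk, risk_weight)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

print(best_route(graph, "A", "D"))   # a risk-averse rider prefers A-C-D here
```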

Relevance:

10.00%

Publisher:

Abstract:

This thesis is concerned with inductive charging of electric vehicle batteries. Rectified power from the 50/60 Hz utility feeds a dc-ac converter which delivers high-frequency ac power to the electric vehicle inductive coupling inlet. The inlet configuration has been defined by the Society of Automotive Engineers in Recommended Practice J-1773. This thesis studies converter topologies related to the series resonant converter. When coupled to the vehicle inlet, the frequency-controlled series-resonant converter results in a capacitively-filtered series-parallel LCLC (SP-LCLC) resonant converter topology with zero voltage switching and many other desirable features. A novel time-domain transformation analysis, termed Modal Analysis, is developed, using a state variable transformation, to analyze and characterize this multi-resonant fourth-order converter. Next, Fundamental Mode Approximation (FMA) Analysis, based on a voltage-source model of the load, and its novel extension, Rectifier-Compensated FMA (RCFMA) Analysis, are developed and applied to the SP-LCLC converter. The RCFMA Analysis is a simpler and more intuitive analysis than the Modal Analysis, and provides a relatively accurate closed-form solution for the converter behavior. Phase control of the SP-LCLC converter is investigated as a control option. FMA and RCFMA Analyses are used for detailed characterization. The analyses identify areas of operation, which are also validated experimentally, where it is advantageous to phase control the converter. A novel hybrid control scheme is proposed which integrates frequency and phase control and achieves reduced operating frequency range and improved partial-load efficiency. The phase-controlled SP-LCLC converter can also be configured with a parallel load and is an excellent option for the application. The resulting topology implements soft-switching over the entire load range and has high full-load and partial-load efficiencies. RCFMA Analysis is used to analyze and characterize the new converter topology, and good correlation is shown with experimental results. Finally, a novel single-stage power-factor-corrected ac-dc converter is introduced, which uses the current-source characteristic of the SP-LCLC topology to provide power factor correction over a wide output power range from zero to full load. This converter exhibits all the advantageous characteristics of its dc-dc counterpart, with a reduced parts count and cost. Simulation and experimental results verify the operation of the new converter.
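
For orientation, the block below is a minimal sketch of the standard FMA substitutions for a generic capacitively filtered resonant stage, assuming a full-bridge inverter and a full-bridge rectifier; it is textbook bookkeeping, not the thesis's RCFMA derivation, which adds a rectifier correction not reproduced here.

```latex
% Standard FMA substitutions for a generic capacitively filtered resonant stage
% (textbook form, shown for orientation only; notation may differ from the thesis).
\begin{align}
  v_1(t) &= \frac{4 V_{dc}}{\pi}\,\sin(\omega_s t)
    && \text{fundamental of the full-bridge square-wave drive} \\
  R_{ac} &= \frac{8}{\pi^{2}}\,R_L
    && \text{equivalent ac load of the capacitively filtered rectifier} \\
  \frac{V_o}{V_{dc}} &\approx \bigl|\,H_1(j\omega_s)\,\bigr|
    && \text{dc gain from the tank transfer function loaded by } R_{ac}
\end{align}
```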

Relevance:

10.00%

Publisher:

Abstract:

Can my immediate physical environment affect how I feel? The instinctive answer to this question must be a resounding “yes”. What might seem a throwaway remark is increasingly borne out by research in environmental and behavioural psychology, and in the more recent discipline of Evidence-Based Design. Research outcomes are beginning to converge with findings in neuroscience and neurophysiology, as we discover more about how the human brain and body function and react to environmental stimuli. What we see, hear, touch, and sense affects each of us psychologically and, by extension, physically, on a continual basis. The physical characteristics of our daily environment thus have the capacity to profoundly affect all aspects of our functioning, from biological systems to cognitive ability. This has long been understood on an intuitive basis, and utilised on a more conscious basis by architects and other designers. Recent research in evidence-based design, coupled with advances in neurophysiology, confirms what was previously held as common understanding, but also illuminates an almost frightening potential to do enormous good, or alternatively terrible harm, by virtue of how we make our everyday surroundings. The thesis adopts a design methodology in its approach to exploring the potential use of wireless sensor networks in environments for elderly people. Vitruvian principles of “commodity, firmness and delight” inform the research process and become embedded in the final design proposals and research conclusions. The issue of person-environment fit becomes a key principle in describing a model of continuously evolving responsive architecture which makes the individual user its focus, with the intention of promoting wellbeing. The key research questions are: What are the key system characteristics of an adaptive therapeutic single-room environment? How can embedded technologies be utilised to maximise the adaptive and therapeutic aspects of the personal life-space of an elderly person with dementia?

Relevance:

10.00%

Publisher:

Abstract:

Future high speed communications networks will transmit data predominantly over optical fibres. As consumer and enterprise computing will remain the domain of electronics, the electro-optical conversion will be pushed further downstream towards the end user. Consequently, efficient tools are needed for this conversion and, due to many potential advantages, including low cost and high output powers, long wavelength Vertical Cavity Surface Emitting Lasers (VCSELs) are a viable option. Drawbacks, such as broader linewidths than competing options, can be mitigated through additional techniques such as Optical Injection Locking (OIL), which can require significant expertise and expensive equipment. This thesis addresses these issues by removing some of the experimental barriers to achieving performance increases via remote OIL. Firstly, numerical simulations of the phase and the photon and carrier numbers of an OIL semiconductor laser allowed the classification of the stable locking phase limits into three distinct groups. The frequency detuning of constant phase values (φ) was considered, in particular φ = 0, where the modulation response parameters were shown to be independent of the linewidth enhancement factor, α. A new method to estimate α and the coupling rate in a single experiment was formulated. Secondly, a novel technique to remotely determine the locked state of a VCSEL, based on voltage variations of 2 mV–30 mV during detuned injection, was developed which can identify oscillatory and locked states. 2D and 3D maps of voltage, optical and electrical spectra illustrate the corresponding behaviours. Finally, the use of directly modulated VCSELs as light sources for passive optical networks was investigated by successful transmission of data at 10 Gbit/s over 40 km of single mode fibre (SMF) using cost-effective electronic dispersion compensation to mitigate errors due to wavelength chirp. A widely tuneable MEMS-VCSEL was established as a good candidate for an externally modulated colourless source after a record error-free transmission at 10 Gbit/s over 50 km of SMF across a 30 nm single mode tuning range. The ability to remotely set the emission wavelength using the novel methods developed in this thesis was demonstrated.
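
The simulations mentioned above integrate coupled equations for the photon number, phase and carrier number of the injection-locked laser. As a point of reference, one common textbook form of these rate equations is sketched below; the notation is generic and the thesis's exact model, including how the coupling and detuning terms are defined, may differ.

```latex
% Generic textbook form of injection-locked semiconductor laser rate equations.
% S: photon number, phi: phase offset to the master, N: carrier number,
% k_c: coupling rate, alpha: linewidth enhancement factor, Delta-omega: detuning.
\begin{align}
  \frac{dS}{dt}    &= \left[\, G(N) - \frac{1}{\tau_p} \,\right] S
                      + 2 k_c \sqrt{S_{\mathrm{inj}} S}\;\cos\phi \\
  \frac{d\phi}{dt} &= \frac{\alpha}{2}\left[\, G(N) - \frac{1}{\tau_p} \,\right]
                      - \Delta\omega
                      - k_c \sqrt{\frac{S_{\mathrm{inj}}}{S}}\;\sin\phi \\
  \frac{dN}{dt}    &= \frac{I}{q} - \frac{N}{\tau_n} - G(N)\,S
\end{align}
```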

Relevance:

10.00%

Publisher:

Abstract:

The transition to becoming a leader is perhaps the least understood and most difficult in business. This Portfolio of Exploration examines the development of conscious awareness and meaning complexity as key transformational requirements to operate competently at leadership level and to succeed in a work environment characterised by change and complexity. It recognises that developing executive leadership capability is not just an issue of personality, of increasing what we know, or of expertise. It requires development of complexity in terms of how we know ourselves, relate to others, construe leadership and organisation, problem solve in business and understand the world as a whole. The exploration is grounded in the theory of adult mental development as outlined by Robert Kegan (1982, 1994) and in his collaborations with Lisa Laskow Lahey (2001, 2009). The theory points to levels of consciousness which affect how we make meaning of and experience the world around us and respond to it. Critically, it also points to transformational processes which enable us to evolve how we make meaning of our world as a means to close the mismatch between the demands of this world and our ability to cope. The exploration is laid out in three stages. Using Kegan’s (1982, 1994) theory as a framework, it begins with a reflection on my career to surface how I made meaning of banking, management and subsequently leadership. In stage two I engage with a range of source thinkers in the areas of leadership, decision making, business, organisation, growth and complexity, in a transformational process of developing a more conscious and complex understanding of organisational leadership (also recognising ever-increasing complexity in the world). Finally, in stage three, I explore how the qualitative changes resulting from this transformational effort have benefited my professional, leadership and organisational capabilities.

Relevance:

10.00%

Publisher:

Abstract:

Recent years have witnessed rapid growth in the demand for streaming video over the Internet and mobile networks, which exposes challenges in coping with heterogeneous devices and varying network throughput. Adaptive schemes, such as scalable video coding, are an attractive solution but fare badly in the presence of packet losses. Techniques that use description-based streaming models, such as multiple description coding (MDC), are more suitable for lossy networks, and can mitigate the effects of packet loss by increasing the error resilience of the encoded stream, but at an increased transmission byte cost. In this paper, we present our adaptive scalable streaming technique, adaptive layer distribution (ALD). ALD is a novel scalable media delivery technique that optimises the trade-off between streaming bandwidth and error resiliency. ALD is based on the principle of layer distribution, in which the critical stream data are spread amongst all packets, thus lessening the impact of network losses on quality. Additionally, ALD provides a parameterised mechanism for dynamic adaptation of the resiliency of the scalable video. The subjective testing results illustrate that our techniques and models were able to provide consistently high-quality viewing, with lower transmission cost, relative to MDC, irrespective of clip type. This highlights the benefits of selective packetisation in addition to intuitive encoding and transmission.
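
A toy illustration of the layer-distribution principle, assuming a single base layer and one enhancement layer split across a fixed packet group: spreading slices of the critical base-layer data across every packet means that any single loss removes only a small slice of the critical data. This is the intuition only, not the paper's ALD packetiser or its adaptation mechanism.

```python
# Toy layer-distribution packetiser: every packet carries a slice of the base
# layer plus a slice of the enhancement layer (illustrative, made-up sizes).
def distribute_layers(base_layer: bytes, enh_layer: bytes, n_packets: int):
    """Return n_packets payloads, each carrying a slice of base + enhancement."""
    def slices(data, n):
        step = -(-len(data) // n)              # ceiling division
        return [data[i * step:(i + 1) * step] for i in range(n)]
    base_slices = slices(base_layer, n_packets)
    enh_slices = slices(enh_layer, n_packets)
    return [base_slices[i] + enh_slices[i] for i in range(n_packets)]

packets = distribute_layers(b"B" * 40, b"e" * 80, n_packets=8)
# Simulate losing packet 3: the decoder still holds 7/8 of the base layer.
received = [p for i, p in enumerate(packets) if i != 3]
print(len(received), "packets received,", sum(len(p) for p in received), "bytes")
```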

Relevance:

10.00%

Publisher:

Abstract:

In this investigation, we examined 256 cases of financial failure and fraud in Vietnam’s chaotic years from 2007 to 2013. Categorical data analyses suggest that the rent-seeking approach, or resource-based orientation, alone does not help explain the outcome of a business intention, while the association between Orientation and Approach is the best-fit predictor. Rampant financial collapse not only increases the cost of funds but also erodes trust in the economy. Entrepreneurship development and creativity capacity building, in light of this, are necessary to improve socio-economic conditions and the environment. In this manuscript, we also introduce intuitive and cognitive factors to predict the ex-ante outcome of a financing scheme.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication, leading to misinterpretation and ultimately resulting in errors in research and even clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. METHODS/PRINCIPAL FINDINGS: We conducted a qualitative analysis of encounters between clinical researchers and data analysis specialists (an epidemiologist, a clinical epidemiologist, and a data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major themes emerged. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. CONCLUSION/SIGNIFICANCE: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.

Relevance:

10.00%

Publisher:

Abstract:

Gemstone Team Vision

Relevance:

10.00%

Publisher:

Abstract:

Gemstone Team Future Firefighting Advancements

Relevance:

10.00%

Publisher:

Abstract:

An enterprise information system (EIS) is an integrated data-applications platform characterized by diverse, heterogeneous, and distributed data sources. For many enterprises, a number of business processes still depend heavily on static rule-based methods and extensive human expertise. Enterprises are faced with the need for optimizing operation scheduling, improving resource utilization, discovering useful knowledge, and making data-driven decisions.

This thesis research is focused on real-time optimization and knowledge discovery that addresses workflow optimization, resource allocation, as well as data-driven predictions of process-execution times, order fulfillment, and enterprise service-level performance. In contrast to prior work on data analytics techniques for enterprise performance optimization, the emphasis here is on realizing scalable and real-time enterprise intelligence based on a combination of heterogeneous system simulation, combinatorial optimization, machine-learning algorithms, and statistical methods.

On-demand digital-print service is a representative enterprise requiring a powerful EIS. We use real-life data from Reischling Press, Inc. (RPI), a digital-print-service provider (PSP), to evaluate our optimization algorithms.

In order to handle the increase in volume and diversity of demands, we first present a high-performance, scalable, and real-time production scheduling algorithm for production automation based on an incremental genetic algorithm (IGA). The objective of this algorithm is to optimize the order dispatching sequence and balance resource utilization. Compared to prior work, this solution is scalable for a high volume of orders and it provides fast scheduling solutions for orders that require complex fulfillment procedures. Experimental results highlight its potential benefit in reducing production inefficiencies and enhancing the productivity of an enterprise.
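
The sketch below is a minimal illustration of the incremental idea, assuming a toy makespan objective and a simple swap-mutation operator: when new orders arrive, they are spliced into the existing population rather than restarting the search. It is not the thesis's IGA, whose chromosome encoding, fitness function and operators are more involved.

```python
# Minimal incremental-GA sketch for order dispatching (illustrative only).
import random

def fitness(seq, durations, n_machines=3):
    """Toy objective: makespan of greedily assigning the sequence to machines."""
    loads = [0.0] * n_machines
    for order in seq:
        loads[loads.index(min(loads))] += durations[order]
    return max(loads)

def evolve(population, durations, generations=50):
    """Keep the better half each generation and add swap-mutated copies."""
    for _ in range(generations):
        population.sort(key=lambda s: fitness(s, durations))
        parents = population[: len(population) // 2]
        children = []
        for p in parents:
            child = p[:]
            i, j = random.sample(range(len(child)), 2)      # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        population = parents + children
    return sorted(population, key=lambda s: fitness(s, durations))

def incremental_insert(population, new_orders):
    """Splice newly arrived orders into every chromosome at random positions."""
    for seq in population:
        for o in new_orders:
            seq.insert(random.randrange(len(seq) + 1), o)
    return population

durations = {i: random.uniform(1, 5) for i in range(10)}
population = [random.sample(list(durations), len(durations)) for _ in range(20)]
population = evolve(population, durations)
print("best makespan:", round(fitness(population[0], durations), 2))

durations.update({10: 4.2, 11: 1.3})                        # two new orders arrive
population = evolve(incremental_insert(population, [10, 11]), durations)
print("best makespan with new orders:", round(fitness(population[0], durations), 2))
```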

We next discuss analysis and prediction of different attributes involved in hierarchical components of an enterprise. We start from a study of the fundamental processes related to real-time prediction. Our process-execution time and process-status prediction models integrate statistical methods with machine-learning algorithms. In addition to improved prediction accuracy compared to stand-alone machine-learning algorithms, the approach also performs a probabilistic estimation of the predicted status. An order generally consists of multiple series and parallel processes. We next introduce an order-fulfillment prediction model that combines the advantages of multiple classification models by incorporating flexible decision-integration mechanisms. Experimental results show that adopting due dates recommended by the model can significantly reduce the enterprise's late-delivery ratio. Finally, we investigate service-level attributes that reflect the overall performance of an enterprise. We analyze and decompose time-series data into different components according to their hierarchical periodic nature, perform correlation analysis, and develop univariate prediction models for each component as well as multivariate models for correlated components. Predictions for the original time series are aggregated from the predictions of its components. In addition to a significant increase in mid-term prediction accuracy, this distributed modeling strategy also improves short-term time-series prediction accuracy.
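
The sketch below illustrates the component-wise strategy on a synthetic daily series, assuming a linear trend, a weekly seasonal profile, and a constant residual forecast; each component is forecast separately and the component forecasts are summed. The thesis's univariate and multivariate component models are more sophisticated than this.

```python
# Illustrative component-wise forecasting: decompose, forecast per component, aggregate.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(140)                                    # 20 weeks of daily data (synthetic)
y = 0.05 * t + 2 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 0.3, t.size)

period = 7
coef = np.polyfit(t, y, 1)                            # linear trend component
trend = np.polyval(coef, t)
seasonal = np.array([(y - trend)[i::period].mean() for i in range(period)])
residual = y - trend - np.tile(seasonal, t.size // period)

h = 14                                                # forecast horizon (days)
trend_fc = np.polyval(coef, np.arange(t.size, t.size + h))
seasonal_fc = np.tile(seasonal, -(-h // period))[:h]  # repeat the weekly profile
resid_fc = np.full(h, residual.mean())

forecast = trend_fc + seasonal_fc + resid_fc          # aggregate the component forecasts
print(np.round(forecast[:7], 2))
```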

In summary, this thesis research has led to a set of characterization, optimization, and prediction tools for an EIS to derive insightful knowledge from data and use it as guidance for production management. It is expected to provide solutions for enterprises to increase reconfigurability, accomplish more automated procedures, and obtain data-driven recommendations or effective decisions.