44 results for context-aware applications


Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to investigate the biometric technologies adopted by hotels and the perception of hotel managers toward biometric technology applications. A descriptive, cross-sectional survey was developed based on an extensive review of the literature and expert opinions. The population for this survey was property-level executive managers in U.S. hotels; members of the American Hotel and Lodging Association (AHLA) were selected as the target population for this study. The most frequent use of biometric technology is fingerprint scanning of hotel employees. Cost still appears to be one of the major barriers to the adoption of biometric technology applications. The findings of this study showed that biometric technology applications definitely have a future in hotels; however, according to hoteliers, neither guests nor hoteliers are fully ready for them.

Relevance:

30.00%

Publisher:

Abstract:

Recently, energy efficiency, or green IT, has become a hot issue for many IT infrastructures as they attempt to utilize energy-efficient strategies in their enterprise IT systems in order to minimize operational costs. Networking devices are shared resources connecting important IT infrastructures; in a data center network in particular, they operate 24/7 and consume a huge amount of energy, and it has been shown that this energy consumption is largely independent of the traffic passing through the devices. As a result, power consumption in networking devices is becoming more and more of a critical problem, of interest to both the research community and the general public. Multicast benefits group communications by saving link bandwidth and improving application throughput, both of which are important for a green data center. In this paper, we study the deployment strategy of multicast switches in hybrid mode in an energy-aware data center network, taking the well-known fat-tree topology as a case study. The objective is to find the best locations to deploy multicast switches so as not only to achieve optimal bandwidth utilization but also to minimize power consumption. We show that nearly 50% energy savings can easily be achieved after applying our proposed algorithm.
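The intuition behind the roughly 50% figure can be illustrated with a toy link-energy model: unicast sends an independent copy along the full path to every receiver, while a multicast switch lets receivers share the trunk of the delivery tree. All numbers below (per-port power, hop counts) are illustrative assumptions, not values from the paper.

```python
# Toy energy model contrasting unicast and multicast delivery in a
# fat-tree-like data center network. Hop counts and power figures are
# illustrative assumptions only.

PORT_POWER_W = 1.0  # assumed energy cost of keeping one link-hop active


def unicast_links(num_receivers, hops_per_path=4):
    """Each receiver gets its own copy along a full path."""
    return num_receivers * hops_per_path


def multicast_links(num_receivers, shared_hops=2, branch_hops=2):
    """One shared trunk up to a multicast switch, then one branch per receiver."""
    return shared_hops + num_receivers * branch_hops


def energy_saving(num_receivers):
    """Fraction of link energy saved by multicast relative to unicast."""
    unicast = unicast_links(num_receivers) * PORT_POWER_W
    multicast = multicast_links(num_receivers) * PORT_POWER_W
    return 1 - multicast / unicast
```

For large groups the saving approaches 1 - branch_hops/hops_per_path, i.e. close to 50% under these assumed hop counts.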

Relevance:

30.00%

Publisher:

Abstract:

Current methods of understanding microbiome composition and structure rely on accurately estimating the number of distinct species and their relative abundance. Most of these methods require an efficient PCR whose forward and reverse primers bind well to the same, large number of identifiable species and produce amplicons that are unique. It is therefore not surprising that the currently used universal primers, designed many years ago, are less efficient and fail to bind to recently cataloged species. We propose an automated, general method of designing PCR primer pairs that abides by primer design rules and uses a current sequence database as input. Since the method is automated, primers can be designed for targeted microbial species or updated as species are added to or deleted from the database. In silico and laboratory experiments confirm the efficacy of the newly designed primers for metagenomics applications.
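The abstract does not spell out which primer design rules the pipeline enforces. As an illustration only, a sketch of the kind of rule checking such a method automates might use common textbook criteria (length, GC content, Wallace-rule melting temperature, a G/C clamp at the 3' end); all thresholds below are assumptions.

```python
def gc_fraction(seq):
    """Fraction of G and C bases in a candidate primer."""
    return (seq.count("G") + seq.count("C")) / len(seq)


def wallace_tm(seq):
    """Wallace-rule melting temperature estimate, in degrees Celsius."""
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))


def passes_basic_rules(primer):
    """Check a candidate primer against common (assumed) design rules."""
    return (
        18 <= len(primer) <= 25                 # typical primer length
        and 0.4 <= gc_fraction(primer) <= 0.6   # balanced GC content
        and 50 <= wallace_tm(primer) <= 65      # workable annealing range
        and primer[-1] in "GC"                  # 3' GC clamp
    )
```

In an automated pipeline, candidates passing such filters would then be screened against the sequence database for coverage and amplicon uniqueness.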

Relevance:

30.00%

Publisher:

Abstract:

In recent years, there has been an enormous growth of location-aware devices, such as GPS-embedded cell phones, mobile sensors and radio-frequency identification tags. The age of combining sensing, processing and communication in one device gives rise to a vast number of applications, leading to endless possibilities and a realization of mobile Wireless Sensor Network (mWSN) applications. As computing, sensing and communication become more ubiquitous, trajectory privacy becomes a critical piece of information and an important factor for commercial success. While on the move, sensor nodes continuously transmit data streams of sensed values and spatiotemporal information, known as "trajectory information". If adversaries can intercept this information, they can monitor the trajectory path and capture the location of the source node.
This research stems from the recognition that the wide applicability of mWSNs will remain elusive unless a trajectory privacy preservation mechanism is developed. The outcome seeks to lay a firm foundation in the field of trajectory privacy preservation in mWSNs against external and internal trajectory privacy attacks. First, to prevent external attacks, we investigated a context-based, trajectory-privacy-aware routing protocol to prevent eavesdropping attacks. Traditional shortest-path-oriented routing algorithms give adversaries the possibility of locating the target node in a certain area. We designed a novel privacy-aware routing phase and utilized the trajectory dissimilarity between mobile nodes to mislead adversaries about the location where a message started its journey. Second, to detect internal attacks, we developed a software-based attestation solution to detect compromised nodes. We created a dynamic attestation node chain among neighboring nodes to examine the memory checksums of suspicious nodes. The computation time for memory traversal was improved compared to previous work.
Finally, we revisited the trust issue in trajectory privacy preservation mechanism designs. We used Bayesian game theory to model and analyze the behaviors of cooperative, selfish and malicious nodes in trajectory privacy preservation activities.
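The abstract does not define its trajectory dissimilarity measure. As a purely illustrative sketch, one simple choice is the mean point-wise distance between time-aligned position samples, with the next relay chosen as the neighbor whose recent trajectory diverges most from the source's; both the metric and the selection rule here are assumptions.

```python
import math


def dissimilarity(traj_a, traj_b):
    """Mean Euclidean distance between time-aligned samples of two trajectories.
    (An assumed metric; the dissertation's actual measure is not given here.)"""
    n = min(len(traj_a), len(traj_b))
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b)) / n


def pick_relay(own_traj, neighbors):
    """Pick the neighbor id whose trajectory is most dissimilar to ours,
    so the message's apparent origin diverges from the true source path."""
    return max(neighbors, key=lambda nid: dissimilarity(own_traj, neighbors[nid]))
```

Routing through highly dissimilar neighbors decorrelates the message path from the source's own trajectory, which is the misdirection effect the protocol relies on.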

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis was to compare graphene nanoplatelets (GNP) and WS2 as solid lubricant additives to aluminum in order to reduce friction and wear. The central hypothesis of this work relied on the lubricating properties of 2D materials, which consist of layers that slip under a shear force. Two aluminum composites were made (Al-2 vol.% GNP and Al-2 vol.% WS2) by spark plasma sintering. Tribological properties were evaluated by ball-on-disk wear tests at room temperature (RT) and 200°C. WS2 not only presented the lowest coefficient of friction (COF, 0.66) but also improved the wear resistance of aluminum by 54% at RT. The Al-2 vol.% GNP composite displayed poor densification (91%) and low hardness, resulting in poor wear resistance. The wear rate of the Al-2 vol.% GNP composite increased by 233% at RT and 48% at 200°C as compared to pure aluminum. GNP addition also resulted in a lower COF (0.79) as compared to pure aluminum (0.87).

Relevance:

30.00%

Publisher:

Abstract:

Since the 1950s, the theory of deterministic and nondeterministic finite automata (DFAs and NFAs, respectively) has been a cornerstone of theoretical computer science. In this dissertation, our main object of study is minimal NFAs. In contrast with minimal DFAs, minimal NFAs are computationally challenging: first, there can be more than one minimal NFA recognizing a given language; second, the problem of converting an NFA to a minimal equivalent NFA is NP-hard, even for NFAs over a unary alphabet. Our study is based on the development of two main theories, inductive bases and partials, which in combination form the foundation for an incremental algorithm, ibas, to find minimal NFAs. An inductive basis is a collection of languages with the property that it can generate (through union) each of the left quotients of its elements. We prove a fundamental characterization theorem which says that a language can be recognized by an n-state NFA if and only if it can be generated by an n-element inductive basis. A partial is an incompletely-specified language. We say that an NFA recognizes a partial if its language extends the partial, meaning that the NFA's behavior is unconstrained on unspecified strings; it follows that a minimal NFA for a partial is also minimal for its language. We therefore direct our attention to minimal NFAs recognizing a given partial. Combining inductive bases and partials, we generalize our characterization theorem, showing that a partial can be recognized by an n-state NFA if and only if it can be generated by an n-element partial inductive basis. We apply our theory to develop and implement ibas, an incremental algorithm that finds minimal partial inductive bases generating a given partial. In the case of unary languages, ibas can often find minimal NFAs of up to 10 states in about an hour of computing time; with brute-force search this would require many trillions of years.
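The notion of an NFA recognizing a partial can be made concrete with a small sketch (not the ibas implementation itself): represent a partial as a mapping from specified strings to accept/reject decisions, and check that the NFA's behavior extends it, leaving unspecified strings unconstrained.

```python
def nfa_accepts(transitions, start, accept, string):
    """Simulate an NFA. transitions maps (state, symbol) -> set of next states."""
    current = set(start)
    for symbol in string:
        current = set().union(*(transitions.get((s, symbol), set()) for s in current))
    return bool(current & set(accept))


def recognizes_partial(transitions, start, accept, partial):
    """True iff the NFA's language extends the partial (dict: string -> bool).
    Strings absent from the partial are unconstrained."""
    return all(nfa_accepts(transitions, start, accept, s) == want
               for s, want in partial.items())
```

For example, a two-state unary NFA accepting even-length strings over {a} recognizes the partial {"" : accept, "a" : reject, "aaaa" : accept}, with "aa" and "aaa" left unspecified; any NFA minimal for the partial is then also minimal for some language extending it.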

Relevance:

30.00%

Publisher:

Abstract:

Wireless sensor networks are emerging as effective tools in the gathering and dissemination of data. They can be applied in many fields, including health, environmental monitoring, home automation and the military. As with all other computing systems, it is necessary to include security features so that security-sensitive data traversing the network is protected. However, traditional security techniques cannot be applied to wireless sensor networks because of the constraints on battery power, memory, and the computational capacities of the miniature wireless sensor nodes. Therefore, to address this need, it becomes necessary to develop new lightweight security protocols. This dissertation focuses on designing a suite of lightweight trust-based security mechanisms and a cooperation enforcement protocol for wireless sensor networks. It presents a trust-based cluster-head election mechanism used to elect new cluster heads. This solution prevents a major security breach against the routing protocol, namely, the election of malicious or compromised cluster heads. The dissertation also describes a location-aware, trust-based, compromised-node detection and isolation mechanism. Both of these mechanisms rely on the ability of a node to monitor its neighbors. Using neighbor monitoring techniques, the nodes are able to determine their neighbors' reputation and trust levels through probabilistic modeling. The mechanisms were designed to mitigate internal attacks within wireless sensor networks. The feasibility of the approach is demonstrated through extensive simulations. The dissertation also addresses non-cooperation problems in multi-user wireless sensor networks. A scalable, lightweight cooperation enforcement algorithm using evolutionary game theory is designed, and its effectiveness is validated through mathematical analysis and simulation.
This research has advanced the knowledge of wireless sensor network security and cooperation by developing new techniques based on mathematical models. In doing so, we have enabled others to build on our work toward the creation of highly trusted wireless sensor networks, facilitating their full utilization in fields ranging from civilian to military applications.
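The abstract says neighbor reputations are derived through probabilistic modeling without giving the exact model. One standard choice for this kind of scheme, shown here purely as an illustrative sketch, is the beta reputation system: a node's trust is the expected value of a Beta distribution over its observed good and bad forwarding events, and cluster-head candidates must clear an (assumed) trust threshold.

```python
def trust_value(good, bad):
    """Expected value of Beta(good + 1, bad + 1): trust from observed behavior."""
    return (good + 1) / (good + bad + 2)


def eligible_cluster_heads(observations, threshold=0.7):
    """Filter neighbors whose trust meets an assumed election threshold.
    observations: dict node_id -> (good_events, bad_events)."""
    return [nid for nid, (g, b) in observations.items()
            if trust_value(g, b) >= threshold]
```

The +1 pseudo-counts give unknown nodes a neutral starting trust of 0.5, so a node must accumulate good behavior before it can be elected cluster head.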

Relevance:

30.00%

Publisher:

Abstract:

This dissertation studies the manipulation of particles using acoustic stimulation for applications in microfluidics and the templating of devices. The term particle is used here to denote any solid, liquid or gaseous material that has properties distinct from the fluid in which it is suspended. Manipulation means taking over the movement of the particles and positioning them in specified locations. Using devices microfabricated out of silicon, the behavior of particles under acoustic stimulation was studied, with the main purpose of aligning the particles at either low-pressure zones, known as nodes, or high-pressure zones, known as anti-nodes. By aligning particles at the nodes in a flow system, the particles can be focused at the center or walls of a microchannel in order ultimately to separate them. These separations are of high scientific importance, especially in the biomedical domain, since acoustophoresis provides a unique approach to separation based on density and compressibility, unparalleled by other techniques. Control and alignment of the particles in various geometries and configurations were successfully achieved by controlling the acoustic waves. Apart from their use in flow systems, a stationary suspended-particle device was developed to provide controllable light transmittance based on acoustic stimuli. Using a glass compartment and a carbon-particle suspension in an organic solvent, the device responded to acoustic stimulation by aligning the particles. The alignment of light-absorbing carbon particles afforded an increase in visible light transmittance as high as 84.5%, controlled by adjusting the frequency and amplitude of the acoustic wave. The device also demonstrated alignment memory, rendering it energy-efficient. A similar device for particles suspended in a monomer enabled the development of electrically conductive films based on networks of conductive particles.
Elastomers doped with conductive metal particles were rendered surface-conductive at particle loadings as low as 1% by weight using acoustic focusing. The resulting films were flexible, had transparencies exceeding 80% in the visible spectrum (400-800 nm), and had electrical bulk conductivities exceeding 50 S/cm.
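In a standing acoustic wave, pressure nodes repeat every half wavelength, so the particle alignment pitch follows directly from the drive frequency and the speed of sound in the suspending fluid. The numbers below (a water-like medium and a 2 MHz drive) are illustrative assumptions, not parameters from the dissertation.

```python
def node_spacing(freq_hz, sound_speed=1480.0):
    """Distance between adjacent pressure nodes (half a wavelength), in meters.
    sound_speed defaults to an assumed water-like medium (m/s)."""
    wavelength = sound_speed / freq_hz
    return wavelength / 2

# e.g. a 2 MHz drive in a water-like fluid spaces the nodes ~370 micrometers apart
```

This relation is what lets the alignment pattern (and hence transmittance or conductive-network pitch) be tuned simply by changing the drive frequency.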

Relevance:

30.00%

Publisher:

Abstract:

This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct ANN-based studies in a powerful and very user-friendly interface. A series of unique features has been implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the activation functions' slopes. It also includes a Support Vector Machines module for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG. This analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and non-epileptic subjects. The contribution of this method consisted of developing a descriptor matrix that can represent any EEG file regardless of its duration and number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies, together with its duration above a specific threshold, performs better than the other frequencies in seizure detection, exhibiting an accuracy of 95.90%, a sensitivity of 92.59%, and a specificity of 96.84%. The second study found that Hjorth's activity parameter is sufficient to accurately relate EEG to epileptic and non-epileptic subjects. After testing, the accuracy, sensitivity and specificity of the classifier were all above 0.9667. Statistical tests established the superiority of activity at over 99.99% certainty.
It was demonstrated that (1) the spectral power in the gamma frequencies is highly effective in locating seizures from EEG and (2) activity can be used to link EEG recordings to epileptic and non-epileptic subjects. Both studies involved a high computational load and were made feasible by NeuralStudio. From a medical perspective, both methods proved the merits of NeuralStudio in brain research applications. For its outstanding features, NeuralStudio was recently awarded a patent (US patent No. 7502763).
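The gamma-band feature described above can be sketched as follows; the band edges, sampling rate, and use of a plain FFT periodogram are illustrative assumptions, not NeuralStudio's actual implementation.

```python
import numpy as np


def gamma_band_power(signal, fs, band=(30.0, 80.0)):
    """Mean spectral power of one channel inside the gamma band (assumed edges)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()


def inter_electrode_mean(channels, fs):
    """Average the gamma-band power across electrodes, as in the seizure feature."""
    return float(np.mean([gamma_band_power(ch, fs) for ch in channels]))
```

A seizure detector would then threshold this feature (and its duration above the threshold) per time window, feeding the result to the trained ANN.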

Relevance:

30.00%

Publisher:

Abstract:

With the developments in computing and communication technologies, wireless sensor networks have become popular in a wide range of application areas such as health, military, environment and habitat monitoring. Moreover, wireless acoustic sensor networks have been widely used for target tracking applications due to their passive nature, reliability and low cost. Traditionally, acoustic sensor arrays built in linear, circular or other regular shapes are used for tracking acoustic sources. Maintaining the relative geometry of the acoustic sensors in the array is vital for accurate target tracking, which greatly reduces the flexibility of the sensor network. To overcome this limitation, we propose using only a single acoustic sensor at each sensor node. This design greatly improves the flexibility of the sensor network and makes it possible to deploy the network in remote or hostile regions through air-drop or other stealth approaches. Acoustic arrays are capable of performing target localization or generating bearing estimates on their own; with only a single acoustic sensor, however, the sensor nodes cannot generate such measurements. Thus, self-organization of sensor nodes into virtual arrays to perform target localization is essential. We developed an energy-efficient, distributed self-organization algorithm for target tracking using wireless acoustic sensor networks. The major error sources of the localization process were studied, and an energy-aware node selection criterion was developed to minimize target localization errors. Using this criterion, the self-organization algorithm selects a near-optimal localization sensor group to minimize target tracking errors. In addition, a message passing protocol was developed to implement the self-organization algorithm in a distributed manner.
To achieve extended sensor network lifetime, energy conservation was built into the self-organization algorithm through a sleep-wakeup management mechanism with a novel cross-layer adaptive wakeup probability adjustment scheme. The simulation results confirm that the developed self-organization algorithm provides satisfactory target tracking performance. Moreover, the energy saving analysis confirms the effectiveness of the cross-layer power management scheme in achieving extended sensor network lifetime without degrading target tracking performance.
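The abstract does not give the node selection criterion's exact form. As an illustrative sketch only, a simple energy-aware version scores each candidate by its distance to the current target estimate minus a residual-energy bonus, then picks the k lowest-cost nodes as the virtual array; the cost function and weights here are assumptions, not the dissertation's criterion.

```python
import math


def select_virtual_array(nodes, target_estimate, k=4, w_dist=1.0, w_energy=0.5):
    """Pick k nodes to form a localization group. Lower cost is better.
    nodes: list of (node_id, (x, y), residual_energy); weights are assumed."""
    def cost(node):
        _, pos, energy = node
        # near the target -> smaller localization error; high residual
        # energy -> longer network lifetime
        return w_dist * math.dist(pos, target_estimate) - w_energy * energy

    return [node_id for node_id, _, _ in sorted(nodes, key=cost)[:k]]
```

In the distributed protocol, each node would evaluate such a score locally and exchange it with neighbors via the message passing protocol to converge on the group.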

Relevance:

30.00%

Publisher:

Abstract:

Access control (AC) is a necessary defense against a large variety of security attacks on the resources of distributed enterprise applications. However, to be effective, AC in some application domains has to be fine-grained, support the use of application-specific factors in authorization decisions, and consistently and reliably enforce organization-wide authorization policies across enterprise applications. Because existing middleware technologies do not provide a complete solution, application developers resort to embedding AC functionality in application systems. This coupling of AC functionality with application logic causes significant problems, including tremendously difficult, costly and error-prone development, integration, and overall ownership of application software. The way AC for application systems is engineered needs to change. In this dissertation, we propose an architectural approach to engineering AC mechanisms that addresses the above problems. First, we develop a framework for implementing the role-based access control (RBAC) model using the AC mechanisms provided by CORBA Security. For those application domains where the granularity of CORBA controls and the expressiveness of the RBAC model suffice, our framework addresses the stated problem. In the second and main part of our approach, we propose an architecture for an authorization service, RAD, to address the problem of controlling access to distributed application resources when the granularity of, and support for, complex policies by middleware AC mechanisms are inadequate. Applying this architecture, we developed a CORBA-based application authorization service (CAAS). Using CAAS, we studied the main properties of the architecture and showed how they can be substantiated by employing CORBA and Java technologies. Our approach enables a wide-ranging solution for controlling access to the resources of distributed enterprise applications.
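The RBAC model at the heart of the framework can be illustrated in a few lines: users acquire permissions only indirectly, through the roles assigned to them. The role and permission names below are invented for the example; this is not the CORBA Security or RAD interface.

```python
# Minimal RBAC sketch: users -> roles -> permissions (names are illustrative).
ROLE_PERMS = {
    "clerk": {"invoice:read"},
    "manager": {"invoice:read", "invoice:approve"},
}
USER_ROLES = {"alice": {"manager"}, "bob": {"clerk"}}


def authorized(user, permission):
    """True iff any of the user's roles grants the requested permission."""
    return any(permission in ROLE_PERMS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

Keeping this mapping in an external authorization service, rather than embedded in application code, is exactly the decoupling the dissertation's architecture argues for.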

Relevance:

30.00%

Publisher:

Abstract:

The enzyme S-adenosyl-L-homocysteine (AdoHcy) hydrolase effects hydrolytic cleavage of AdoHcy to adenosine (Ado) and L-homocysteine (Hcy). The cellular levels of AdoHcy and Hcy are critical because AdoHcy is a potent feedback inhibitor of crucial transmethylation enzymes. Also, elevated plasma levels of Hcy in humans have been shown to be a risk factor in coronary artery disease. On the basis of the previous finding that AdoHcy hydrolase is able to add the enzyme-sequestered water molecule across the 5',6'-double bond of (halo or dihalohomovinyl)adenosines, causing covalent binding inhibition, we designed and synthesized AdoHcy analogues with the 5',6'-olefin motif incorporated in place of the carbon-5' and sulfur atoms. From the available synthetic methods we chose two independent approaches: the first was based on the construction of a new C5'-C6' double bond via metathesis reactions, and the second on the formation of a new C6'-C7' single bond via Pd-catalyzed cross-couplings. Cross-metathesis of the suitably protected 5'-deoxy-5'-methyleneadenosine with racemic 2-amino-5-hexenoate in the presence of the Hoveyda-Grubbs catalyst, followed by standard deprotection, afforded the desired analogue as the 5'E isomer of an inseparable mixture of 9'R/S diastereomers. Metathesis with chiral homoallylglycine [(2S)-amino-5-hexenoate] produced the AdoHcy analogue with established stereochemistry E at the C5' atom and S at the C9' atom. The 5'-bromovinyl analogue was synthesized using a bromination-dehydrobromination strategy with pyridinium tribromide and DBU. Since literature reports on the Pd-catalyzed monoalkylation of dihaloalkenes (Csp2-Csp3 coupling) were scarce, we were prompted to undertake model studies on Pd-catalyzed coupling between vinyl dihalides and alkyl organometallics. The 1-fluoro-1-haloalkenes were found to undergo Negishi couplings with alkylzinc bromides to give multisubstituted fluoroalkenes.
The alkylation was trans-selective, affording pure Z-fluoroalkenes. The highest yields were obtained with the PdCl2(dppb) catalyst, but the best stereochemical outcome was obtained with the less reactive Pd(PPh3)4. Couplings of 1,1-dichloro- and 1,1-dibromoalkenes with organozinc reagents resulted in the formation of monocoupled 1-halovinyl products.

Relevance:

30.00%

Publisher:

Abstract:

The Highway Safety Manual (HSM) estimates roadway safety performance based on predictive models that were calibrated using national data. Calibration factors are then used to adjust these predictive models to local conditions for local applications. The HSM recommends that local calibration factors be estimated using 30 to 50 randomly selected sites that experienced at least a total of 100 crashes per year. It also recommends that the factors be updated every two to three years, preferably on an annual basis. However, these recommendations are primarily based on expert opinions rather than data-driven research findings. Furthermore, most agencies do not have data for many of the input variables recommended in the HSM. This dissertation is aimed at determining the best way to meet three major data needs affecting the estimation of calibration factors: (1) the required minimum sample sizes for different roadway facilities, (2) the required frequency for calibration factor updates, and (3) the influential variables affecting calibration factors. In this dissertation, statewide segment and intersection data were first collected for most of the HSM recommended calibration variables using a Google Maps application. In addition, eight years (2005-2012) of traffic and crash data were retrieved from existing databases from the Florida Department of Transportation. With these data, the effect of sample size criterion on calibration factor estimates was first studied using a sensitivity analysis. The results showed that the minimum sample sizes not only vary across different roadway facilities, but they are also significantly higher than those recommended in the HSM. In addition, results from paired sample t-tests showed that calibration factors in Florida need to be updated annually. 
To identify influential variables affecting the calibration factors for roadway segments, the variables were prioritized by combining the results from three different methods: negative binomial regression, random forests, and boosted regression trees. Only a few variables were found to explain most of the variation in the crash data. Traffic volume was consistently found to be the most influential. In addition, roadside object density, major and minor commercial driveway densities, and minor residential driveway density were also identified as influential variables.
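The HSM's local calibration factor itself is simple to state: it is the ratio of total observed crashes to total SPF-predicted crashes over the selected sample of sites. A minimal sketch follows; the site counts are invented for illustration, not data from this dissertation.

```python
def calibration_factor(observed, predicted):
    """HSM local calibration factor: total observed / total predicted crashes
    over the sampled sites for one facility type."""
    return sum(observed) / sum(predicted)


# Illustrative example: three segments with observed annual crash counts
# and the corresponding SPF-predicted counts (hypothetical numbers).
observed = [42, 35, 23]
predicted = [40, 30, 30]
factor = calibration_factor(observed, predicted)  # 100 / 100 = 1.0
```

A factor above 1.0 means local sites experience more crashes than the national models predict (so predictions are scaled up), and below 1.0 the reverse; the sample-size and update-frequency questions studied here determine how stable this ratio is.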