954 results for continuous biometric authentication system


Relevance:

30.00%

Publisher:

Abstract:

A continuous multi-step synthesis of 1,2-diphenylethane was performed sequentially in a structured compact reactor. The process involved a Heck C-C coupling reaction followed by hydrogenation of the intermediate obtained in the first step. Both reactions were catalysed by microspherical carbon-supported Pd catalysts. Owing to the integration of the micro-heat exchanger, the static mixer and the mesoscale packed-bed reaction channel, the compact reactor proved to be an intensified tool for promoting the reactions. Compared with a batch reactor, the flow process in the compact reactor was more efficient because: (i) the reaction time was significantly reduced (ca. 7 min versus several hours), (ii) no additional ligands were used and (iii) the reaction was run at lower operating pressure and temperature. Pd leached in the Heck reaction step was shown to be effectively recovered in the subsequent hydrogenation section, and the catalytic activity of the system could be largely retained by reverse-flow operation. © 2009 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

For remote, semi-arid areas, brackish groundwater (BW) desalination powered by solar energy may serve as the most technically and economically viable means to alleviate water stress. For such systems, a high recovery ratio is desired because of the technical and economic difficulties of concentrate management. It has been demonstrated that current, conventional solar reverse osmosis (RO) desalination can be improved by a factor of 40–200 by eliminating unnecessary energy losses. In this work, a batch-RO system that can be powered by a thermal Rankine cycle has been developed. By directly recycling high-pressure concentrate and by using a linkage connection to provide increasing feed pressures, the batch-RO has been shown to achieve a 70% saving in energy consumption compared to a continuous single-stage RO system. Theoretical investigations of the mass transfer phenomena, including dispersion and concentration polarization, have been carried out to complement and guide the experimental efforts. The performance evaluation of the batch-RO system, named DesaLink, has been based on extensive experimental tests performed upon it. Operating DesaLink with compressed air as the power supply under laboratory conditions, a freshwater production of approximately 300 litres per day, at a concentration of around 350 ppm, was recorded, whilst the feed water had a concentration range of 2500–4500 ppm; the corresponding linkage efficiency was around 40%. On the computational side, simulation models have been developed and validated for each of the subsystems of DesaLink, upon which an integrated model has been realised for the whole system. Both the subsystem models and the integrated model have been shown to predict the system performance accurately under specific operational conditions. A simulation case study has been performed using the developed model. Simulation results indicate that the system can be expected to achieve a water production of 200 m3 per year using a widely available evacuated-tube solar collector with an area of only 2 m2. This freshwater production would satisfy the drinking water needs of 163 inhabitants in the Rajasthan region, the area for which the case study was performed.
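
Aside from membrane limits, the difficulty of concentrate management at high recovery noted above follows from a simple salt mass balance: with near-complete salt rejection, the brine concentration scales as 1/(1 - r), where r is the recovery ratio. A minimal back-of-the-envelope sketch in Python (the feed value is chosen from the stated 2500–4500 ppm range purely for illustration and is not taken from the DesaLink experiments):

    # Salt mass balance for an RO process (illustrative values only).
    # recovery r = permeate volume / feed volume; assuming near-complete
    # salt rejection, brine concentration = feed concentration / (1 - r).

    def brine_concentration(feed_ppm: float, recovery: float) -> float:
        """Concentration of the concentrate stream for a given recovery ratio."""
        return feed_ppm / (1.0 - recovery)

    for r in (0.3, 0.5, 0.7, 0.9):
        print(f"recovery {r:.0%}: brine ~ {brine_concentration(4000, r):,.0f} ppm "
              f"(concentration factor {1 / (1 - r):.1f}x)")

At 70% recovery the concentrate is already more than three times the feed concentration, which is why high-recovery operation both shrinks the concentrate volume to be managed and raises the osmotic pressure the system must work against.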

Relevance:

30.00%

Publisher:

Abstract:

One of the current research trends in Enterprise Resource Planning (ERP) involves examining the critical factors for its successful implementation. However, such research is limited to system implementation and does not focus on the flexibility of ERP to respond to changes in business. Therefore, this study explores a combination system, made up of an ERP and informality, intended to provide organisations with efficient and flexible performance simultaneously. In addition, this research analyses the benefits and challenges of using such a system. The research was based on socio-technical system (STS) theory, which contains two dimensions: 1) a technical dimension, which evaluates the performance of the system; and 2) a social dimension, which examines the impact of the system on an organisation. A mixed-methods approach was followed in this research. The qualitative part aims to understand the constraints of using a single ERP system and to define a new system that addresses these problems. To achieve this goal, four Chinese companies operating in different industries were studied, all of which faced challenges in using an ERP system due to complexity and uncertainty in their business environments. The quantitative part contains a discrete-event simulation study intended to examine the impact on operational performance when a company implements the hybrid system in a real-life situation. Moreover, this research conducts a further qualitative case study to better understand the influence of the system in an organisation. The empirical aspect of the study reveals that an ERP with pre-determined business activities cannot react promptly to unanticipated changes in a business. Incorporating informality into an ERP makes it possible to react to different situations by using different procedures that are based on the practical knowledge of frontline employees. Furthermore, the simulation study shows that the combination system can achieve a balance between efficiency and flexibility. Unlike existing research, which emphasises continuous improvement in the IT functions of an enterprise system, this research contributes a theoretical definition of a new system with mixed performance, containing both the formal practices embedded in an ERP and informal activities based on human knowledge. It supports both cost-efficiency in executing business transactions and flexibility in coping with business uncertainty. This research also indicates risks of using the system, such as using an ERP with limited functions, a high cost of performing informally, and low system acceptance owing to a shift in organisational culture. With respect to practical contribution, this research suggests that companies can choose the most suitable enterprise system approach in accordance with their operational strategies. The combination system can be implemented in a company that needs to operate with a medium level of volume and variety. By contrast, the traditional ERP system is better suited to a company operating in a high-volume market, while an informal system is more suitable for a firm requiring a high level of variety.

Relevance:

30.00%

Publisher:

Abstract:

Due to vigorous globalisation and product proliferation in recent years, soaring manufacturing activity has produced more waste. This has created a significant need for an efficient waste management system to ensure that waste is properly treated for recycling or disposal. This paper presents a Decision Support System (DSS) framework, based on Constraint Logic Programming (CLP), for managing the collection of industrial waste of all kinds, and discusses the potential employment of Radio-Frequency Identification (RFID) technology to improve several critical procedures involved in managing waste collection. The paper also demonstrates how a widely distributed, semi-structured network of waste-producing enterprises (e.g. manufacturers) and waste-processing enterprises (i.e. waste recycling/treatment stations) can improve their operations planning by using the proposed DSS. The potential RFID applications for updating and validating information in a continuous manner, bringing value-added benefits to the waste collection business, are also presented. © 2012 Inderscience Enterprises Ltd.
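
The abstract does not detail the DSS's constraint model, so the following is only a rough illustration of the kind of constraint-based collection planning a CLP engine performs: assigning waste producers to treatment stations subject to waste-type and capacity constraints. All names and figures below are hypothetical, and the exhaustive search merely stands in for a CLP solver's constraint propagation:

    from itertools import product

    # Hypothetical data: producers with a waste type and volume, and stations
    # with the types they accept and a capacity limit (all values invented).
    producers = {"plant_A": ("metal", 5), "plant_B": ("plastic", 3), "plant_C": ("metal", 4)}
    stations  = {"station_1": ({"metal"}, 8), "station_2": ({"metal", "plastic"}, 9)}

    def feasible(assign):
        """Check waste-type compatibility and per-station capacity constraints."""
        load = {s: 0 for s in stations}
        for p, s in assign.items():
            w_type, volume = producers[p]
            accepted, capacity = stations[s]
            if w_type not in accepted:
                return False
            load[s] += volume
            if load[s] > capacity:
                return False
        return True

    # Enumerate all assignments and keep the feasible ones (a CLP solver would
    # prune this search space instead of enumerating it exhaustively).
    solutions = [dict(zip(producers, combo))
                 for combo in product(stations, repeat=len(producers))
                 if feasible(dict(zip(producers, combo)))]
    print(solutions)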

Relevance:

30.00%

Publisher:

Abstract:

Nonlinearity management is explored as a complete tool to obtain maximum transmission reach in a WDM fiber transmission system, making it possible to optimize multiple system parameters, including optimal dispersion pre-compensation, with fast simulations based on the continuous-wave approximation. © 2006 Optical Society of America.
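
For orientation, the "fast simulations based on the continuous-wave approximation" estimate nonlinear effects from the average channel power rather than resolving the full signal waveform. The sketch below is a generic CW-approximation calculation of the accumulated nonlinear phase per fibre span; it is not the authors' simulator, and its parameter values are typical textbook figures rather than values from the paper:

    import math

    # Continuous-wave estimate of the accumulated nonlinear phase per fibre span:
    # phi_NL = gamma * P * L_eff, with L_eff = (1 - exp(-alpha * L)) / alpha.

    gamma = 1.3e-3      # nonlinear coefficient, 1/(W*m)  (typical SMF value)
    alpha_db_km = 0.2   # fibre loss in dB/km
    span_km = 80.0
    power_mw = 1.0      # launch power per channel

    alpha = alpha_db_km * math.log(10) / 10 / 1000.0     # convert to 1/m
    L = span_km * 1000.0
    L_eff = (1.0 - math.exp(-alpha * L)) / alpha
    phi_nl = gamma * (power_mw * 1e-3) * L_eff
    print(f"effective length ~ {L_eff / 1000:.1f} km, "
          f"nonlinear phase per span ~ {phi_nl * 1e3:.1f} mrad")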

Relevance:

30.00%

Publisher:

Abstract:

The study of complex networks has recently attracted increasing interest because of the large variety of systems that can be modeled using graphs. A fundamental operation in the analysis of complex networks is that of measuring the centrality of a vertex. In this paper, we propose to measure vertex centrality using a continuous-time quantum walk. More specifically, we relate the importance of a vertex to the influence that its initial phase has on the interference patterns that emerge during the quantum walk evolution. To this end, we make use of the quantum Jensen-Shannon divergence between two suitably defined quantum states. We investigate how the importance varies as we change the initial state of the walk and the Hamiltonian of the system. We find that, for a suitable combination of the two, the importance of a vertex is almost linearly correlated with its degree. Finally, we evaluate the proposed measure on two commonly used networks. © 2014 Springer-Verlag Berlin Heidelberg.
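
As a concrete illustration of the machinery involved (not the paper's exact construction, which defines its own initial states and time evolution), the sketch below runs a continuous-time quantum walk on a small graph using the adjacency matrix as the Hamiltonian, flips the initial phase at a probe vertex, and scores that vertex by the quantum Jensen-Shannon divergence between the two resulting states:

    import numpy as np
    from scipy.linalg import expm

    def von_neumann_entropy(rho):
        """S(rho) = -sum_i lambda_i * log2(lambda_i) over nonzero eigenvalues."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return float(-np.sum(evals * np.log2(evals)))

    def qjsd(rho, sigma):
        """Quantum Jensen-Shannon divergence between two density matrices."""
        return von_neumann_entropy((rho + sigma) / 2) - 0.5 * (
            von_neumann_entropy(rho) + von_neumann_entropy(sigma))

    # A small path graph 0-1-2-3; the adjacency matrix plays the role of the Hamiltonian.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    n, t, probe = A.shape[0], 1.0, 2

    psi0 = np.ones(n, dtype=complex) / np.sqrt(n)   # uniform initial state
    psi1 = psi0.copy()
    psi1[probe] *= -1                               # flip the initial phase at the probe vertex

    U = expm(-1j * A * t)                           # CTQW evolution operator
    rhos = [np.outer(s, s.conj()) for s in (U @ psi0, U @ psi1)]  # pure-state density matrices
    print(f"QJSD sensitivity of vertex {probe}: {qjsd(*rhos):.4f}")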

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To develop and validate a classification system for focal vitreomacular traction (VMT) with and without macular hole based on spectral domain optical coherence tomography (SD-OCT), intended to aid in decision-making and prognostication. Methods: A panel of retinal specialists convened to develop this system. A literature review followed by discussion on a wide range of cases formed the basis for the proposed classification. Key features on OCT were identified and analysed for their utility in clinical practice. A final classification was devised based on two sequential, independent validation exercises to improve interobserver variability. Results: This classification tool pertains to idiopathic focal VMT assessed by a horizontal line scan using SD-OCT. The system uses width (W), interface features (I), foveal shape (S), retinal pigment epithelial changes (P), elevation of vitreous attachment (E), and inner and outer retinal changes (R) to give the acronym WISPERR. Each category is scored hierarchically. Results from the second independent validation exercise indicated a high level of agreement between graders: intraclass correlation ranged from 0.84 to 0.99 for continuous variables and Fleiss' kappa values ranged from 0.76 to 0.95 for categorical variables. Conclusions: We present an OCT-based classification system for focal VMT that allows anatomical detail to be scrutinised and scored qualitatively and quantitatively using a simple, pragmatic algorithm, which may be of value in clinical practice as well as in future research studies.
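
For the categorical WISPERR components, agreement figures of the kind quoted above are typically computed with Fleiss' kappa. A minimal, self-contained implementation on invented grading data (purely illustrative; not the validation data from this study):

    import numpy as np

    def fleiss_kappa(counts):
        """counts[i, j] = number of graders assigning scan i to category j
        (every scan graded by the same number of graders)."""
        counts = np.asarray(counts, dtype=float)
        n_scans, n_raters = counts.shape[0], counts[0].sum()
        p_j = counts.sum(axis=0) / (n_scans * n_raters)           # category proportions
        P_i = (np.sum(counts**2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
        P_bar, P_e = P_i.mean(), np.sum(p_j**2)
        return (P_bar - P_e) / (1 - P_e)

    # 5 scans, 4 graders, 3 categories of a hypothetical WISPERR component.
    ratings = [[4, 0, 0],
               [3, 1, 0],
               [0, 4, 0],
               [0, 1, 3],
               [0, 0, 4]]
    print(f"Fleiss' kappa = {fleiss_kappa(ratings):.3f}")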

Relevance:

30.00%

Publisher:

Abstract:

The objective of this project is to design a new desalination system with energy efficiency approaching the theoretical thermodynamic limit—even at high recovery ratio. The system uses reverse osmosis (RO) and a batch principle of operation to overcome the problem of concentration factor which prevents continuous-flow RO systems from ever reaching this limit and thus achieving the minimum possible specific energy consumption, SEC. Batch operation comprises a cycle in three phases: pressurisation, purge, and refill. Energy recovery is inherent to the design. Unlike in closed-circuit desalination (CCD), no feedwater is added to the pressure circuit during the pressurisation phase. The batch configuration is compared to standard configurations such as continuous single-stage RO (with energy recovery) and CCD. Theoretical analysis has shown that the new system is able to use 33% less energy than CCD at a recovery ratio of 80%. A prototype has been constructed using readily available parts and tested with feedwater salinities and recovery ratios ranging from 2,000 to 5,000 ppm and 17.2–70.6%, respectively. Results compare very well against the standard configurations. For example, with feedwater containing 5,000 ppm NaCl and recovery ratio of 69%, a hydraulic SEC of 0.31 kWh/m3 was obtained—better than the minimum theoretically possible with a single-stage continuous flow system with energy recovery device.
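
The energy advantage of batch over continuous single-stage operation can be seen from the ideal (reversible) specific energy consumption: a continuous stage must sustain the final brine's osmotic pressure across the entire permeate volume, whereas a batch process tracks the rising osmotic pressure. A rough sketch using the van 't Hoff estimate for NaCl (illustrative only; the experimental value quoted above also includes pump and friction losses):

    import math

    R, T = 8.314, 298.0               # J/(mol*K), K
    M_NaCl, i_vant_hoff = 0.0585, 2   # kg/mol, dissociation factor

    def osmotic_pressure_pa(ppm):
        """van 't Hoff estimate: pi = i * c * R * T / M, with c in kg/m3 (~ ppm / 1000)."""
        c = ppm / 1000.0
        return i_vant_hoff * c / M_NaCl * R * T

    def sec_kwh_per_m3(joules_per_m3):
        return joules_per_m3 / 3.6e6

    feed_ppm, r = 5000.0, 0.69
    pi_feed = osmotic_pressure_pa(feed_ppm)

    # Ideal continuous single-stage RO with energy recovery: the applied pressure
    # must at least equal the osmotic pressure of the final brine, pi_feed / (1 - r).
    sec_continuous = sec_kwh_per_m3(pi_feed / (1.0 - r))

    # Ideal batch RO: the pressure follows the instantaneous osmotic pressure, so the
    # work per m3 of permeate is (pi_feed / r) * ln(1 / (1 - r)).
    sec_batch = sec_kwh_per_m3(pi_feed / r * math.log(1.0 / (1.0 - r)))

    print(f"ideal continuous SEC ~ {sec_continuous:.2f} kWh/m3, "
          f"ideal batch SEC ~ {sec_batch:.2f} kWh/m3")

On these illustrative figures the batch limit (about 0.20 kWh/m3) sits well below the continuous single-stage limit (about 0.38 kWh/m3), consistent with the reported 0.31 kWh/m3 falling between the two.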

Relevance:

30.00%

Publisher:

Abstract:

Mediation techniques provide interoperability and support integrated query processing among heterogeneous databases. While such techniques help data sharing among different sources, they increase the risk to data security, such as violating access control rules. Successful protection of information by an effective access control mechanism is a basic requirement for interoperation among heterogeneous data sources. This dissertation first identified the challenges a mediation system must meet in order to achieve both interoperability and security in an interconnected and collaborative computing environment, namely: (1) context-awareness, (2) semantic heterogeneity, and (3) multiple security policy specification. Currently, few existing approaches address all three security challenges in mediation systems. This dissertation provides a modeling and architectural solution to the problem of mediation security that addresses the aforementioned security challenges. A context-aware flexible authorization framework was developed in the dissertation to deal with the security challenges faced by mediation systems. The authorization framework consists of two major tasks: specifying security policies and enforcing security policies. Firstly, the security policy specification provides a generic and extensible method to model the security policies with respect to the challenges posed by the mediation system. The security policies in this study are specified by 5-tuples followed by a series of authorization constraints, which are identified based on the relationships among the different security components in the mediation system. Two essential features of mediation systems, i.e., relationships among authorization components and interoperability among heterogeneous data sources, are the focus of this investigation. Secondly, this dissertation supports effective access control on mediation systems while providing uniform access to heterogeneous data sources. The dynamic security constraints are handled in the authorization phase instead of the authentication phase; thus the maintenance cost of the security specification can be reduced compared with related solutions.
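
The abstract does not spell out the fields of its 5-tuple policies or the form of the authorization constraints. Purely for illustration, the sketch below assumes a (subject, object, action, context, decision) tuple plus a constraint predicate, to show how context-aware policies could be evaluated uniformly over mediated sources; none of these field names come from the dissertation:

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    # Hypothetical 5-tuple policy; the real framework's fields are not given in the abstract.
    @dataclass
    class Policy:
        subject: str      # role or user
        obj: str          # mediated resource (e.g. a virtual relation)
        action: str       # read / write / ...
        context: str      # contextual condition label (e.g. "on_site")
        decision: str     # "permit" or "deny"
        constraint: Callable[[Dict], bool] = lambda request: True

    def authorize(policies: List[Policy], request: Dict) -> str:
        """Deny-by-default evaluation; an explicit deny overrides any permit."""
        decision = "deny"
        for p in policies:
            if (p.subject == request["subject"] and p.obj == request["object"]
                    and p.action == request["action"] and p.context == request["context"]
                    and p.constraint(request)):
                if p.decision == "deny":
                    return "deny"
                decision = "permit"
        return decision

    policies = [
        Policy("clinician", "patient_view", "read", "on_site", "permit",
               constraint=lambda r: r.get("purpose") == "treatment"),
        Policy("clinician", "patient_view", "read", "remote", "deny"),
    ]
    print(authorize(policies, {"subject": "clinician", "object": "patient_view",
                               "action": "read", "context": "on_site",
                               "purpose": "treatment"}))   # -> permit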

Relevance:

30.00%

Publisher:

Abstract:

Knowledge of movements and habitat use is necessary to assess a species’ ecological role and is especially important for mesopredators because they provide the link between upper and lower trophic levels. Using acoustic telemetry, we examined coarse-scale diel and seasonal movements of elasmobranch mesopredators on a shallow sandflat in Shark Bay, Western Australia. Giant shovelnose rays (Glaucostegus typus) and reticulate whiprays (Himantura uarnak) were most often detected in nearshore microhabitats and were regularly detected throughout the day and year, although reticulate whiprays tended to frequent the monitored array over longer periods. Pink whiprays (H. fai) and cowtail stingrays (Pastinachus atrus) were also detected throughout the day, but far less frequently. Overall, there was no apparent spatial or temporal partitioning of the sandflats, but residency in the area varied between species. In addition, ray presence throughout the year suggests that previously observed differences in seasonal abundance are likely the result of seasonal changes in habitat use rather than large-scale migrations. The continuous use of the sandflats and the limited movements within this ray community suggest that rays have the potential to be a structuring force in this system and that focusing on nearshore habitats is important for managing subtropical ray populations.

Relevance:

30.00%

Publisher:

Abstract:

Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete for drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage remains to be considered in order to determine the location of the border between the concrete and the soil and hence obtain the diameter of the drilled shaft. It is also economical to devise a methodology that obtains the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system, since CSL tests are already performed to check the integrity of the concrete inside the cage; the shaft diameter can then be determined without having to set up another NDT device. The proposed new method is based on the installation of galvanized tubes outside the shaft, across from each inside tube, and on performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of the concrete and the ultrasonic wave properties using signal processing. The experimental results show a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from the frequency-domain data. This study demonstrates how this new method for measuring the diameter of drilled shafts during construction using an NDT method overcomes the limitations of currently used methods. In the second part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change, at the location of defects, in the frequency amplitude of the signal recorded by the receiver probe, and it is called Frequency Tomography Analysis (FTA). Time-domain data for the signals propagated between tubes are transferred to the frequency domain using the Fast Fourier Transform (FFT), and the distribution of the FTA is then evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth of any void or defect along the length of the drilled shaft for voids inside the cage. The last part of the study evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
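
The FTA step rests on a standard operation: transform each received time-domain trace with an FFT and track the amplitude in the frequency band of interest along the shaft, with a drop in amplitude flagging a possible defect zone. A minimal sketch of that transform on a synthetic trace (signal parameters invented, not the study's field data):

    import numpy as np

    # Synthetic received trace: a 50 kHz tone burst with noise, sampled at 1 MHz.
    fs = 1.0e6
    t = np.arange(0, 2e-3, 1.0 / fs)
    trace = np.exp(-((t - 5e-4) / 1e-4) ** 2) * np.sin(2 * np.pi * 50e3 * t)
    trace += 0.05 * np.random.default_rng(0).normal(size=t.size)

    # Transform to the frequency domain and take the single-sided amplitude spectrum.
    spectrum = np.abs(np.fft.rfft(trace)) / t.size
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

    # The quantity tracked along the shaft (per depth and tube pair) would be the
    # peak amplitude in the band of interest; a drop flags a possible defect zone.
    band = (freqs > 30e3) & (freqs < 70e3)
    print(f"peak amplitude in 30-70 kHz band: {spectrum[band].max():.4f}")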

Relevance:

30.00%

Publisher:

Abstract:

This data set was obtained during R.V. POLARSTERN cruise ANT-XXVIII/3. Current velocities were measured nearly continuously, when outside territorial waters, along the ship's track with a vessel-mounted Teledyne RD Instruments 153.6-kHz Ocean Surveyor ADCP. The transducers were located 11 m below the water line and were protected against ice floes by an acoustically transparent plastic window. The current measurements were made using a pulse length of 2 s and a vertical bin length of 4 m. The ship's velocity was calculated from position fixes obtained by the Global Positioning System (GPS). Heading, roll and pitch data from the ship's gyro platforms and the navigation data were used to convert the ADCP velocities into earth coordinates. The accuracy of the ADCP velocities depends mainly on the quality of the position fixes and the ship's heading data. Further errors stem from a misalignment of the transducer with the ship's centerline. The ADCP data were processed using the Ocean Surveyor Sputum Interpreter (OSSI) software developed by GEOMAR Helmholtz-Zentrum für Ozeanforschung Kiel. The averaging interval was set to 120 seconds. The reference layer was set to bins 5 to 16, avoiding near-surface effects and biases near bin 1. Sampling interval setting: 2 s; number of bins: 80; bin length: 4 m; pulse length: 4 m; blank beyond transmit length: 4 m. Data processing settings: top reference bin: 5; bottom reference bin: 16; average: 120 s; misalignment amplitude: 1.0276 +/- 0.1611, phase: 0.8100 +/- 0.7190. The precision for a single ping and 4 m cell size reported by TRDI is 0.30 m/s. Given the single-ping precision and the number of pings (most of the time 36) during the 120-second averaging interval, the velocity accuracy is approximately 0.05 m/s (velocity accuracy = single-ping precision divided by the square root of the number of pings).
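
The stated accuracy follows directly from averaging independent pings; a one-line check of the figure quoted above:

    import math

    single_ping_precision = 0.30   # m/s, for 4 m bins (manufacturer figure quoted above)
    pings_per_average = 36         # pings in a 120 s averaging interval
    accuracy = single_ping_precision / math.sqrt(pings_per_average)
    print(f"ensemble velocity accuracy ~ {accuracy:.2f} m/s")   # ~0.05 m/s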

Relevance:

30.00%

Publisher:

Abstract:

There are authentication models that use passwords, keys or personal identifiers (cards, tags, etc.) to authenticate a particular user in the authentication/identification process. However, other systems can use biometric data, such as signature, fingerprint or voice, to authenticate an individual. On the other hand, the storage of biometric data can bring risks, such as consistency and protection problems for these data. Given this problem, it is necessary to protect biometric databases to ensure the integrity and reliability of the system. For this purpose there are models for the security/authentication of biometric identification, for example the Fuzzy Vault and Fuzzy Commitment schemes. Currently, these models are the ones mostly used for the protection of biometric data, but they have fragile elements in the protection process. Therefore, increasing the level of security of these methods, through changes in their structure or even by inserting new layers of protection, is one of the goals of this thesis. In other words, this work proposes the simultaneous use of encryption (the Papilio encryption algorithm) with template protection models (Fuzzy Vault and Fuzzy Commitment) in biometric identification systems. The objective is to improve two aspects of biometric systems: security and accuracy. Furthermore, it is necessary to maintain a reasonable level of efficiency for these data through the use of more elaborate classification structures, known as committees. The thesis therefore proposes a model for safer biometric identification systems.
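
The abstract gives no details of the Papilio cipher or of the exact template-protection construction, so the toy sketch below illustrates only the generic fuzzy commitment idea the work builds on: bind a random key to the enrolment template with an error-correcting code, store only a hash of the key plus the XOR "helper" data, and recover the key when a sufficiently close fresh sample is presented. The repetition code and the bit-string "templates" are stand-ins for a real error-correcting code and feature extractor:

    import hashlib
    import secrets

    # Toy fuzzy commitment with a 3x repetition code over bit strings.
    # Purely illustrative; real systems use strong ECCs and proper feature extraction.

    def encode(key_bits):                      # repetition-code encoding
        return [b for b in key_bits for _ in range(3)]

    def decode(code_bits):                     # majority vote per 3-bit group
        return [int(sum(code_bits[i:i + 3]) >= 2) for i in range(0, len(code_bits), 3)]

    def commit(template_bits, key_bits):
        helper = [c ^ t for c, t in zip(encode(key_bits), template_bits)]
        digest = hashlib.sha256(bytes(key_bits)).hexdigest()
        return helper, digest                  # nothing biometric is stored in clear

    def open_commitment(helper, digest, sample_bits):
        key_guess = decode([h ^ s for h, s in zip(helper, sample_bits)])
        return key_guess if hashlib.sha256(bytes(key_guess)).hexdigest() == digest else None

    key = [secrets.randbelow(2) for _ in range(8)]
    template = [secrets.randbelow(2) for _ in range(24)]       # enrolment sample
    noisy = template.copy()                                    # fresh sample with 2 bit errors
    noisy[5] ^= 1
    noisy[17] ^= 1
    helper, digest = commit(template, key)
    print(open_commitment(helper, digest, noisy) == key)       # True: errors corrected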

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with a very important issue in any knowledge engineering discipline: the accurate representation and modelling of real-life data and its processing by human experts. The work is applied to the GRiST Mental Health Risk Screening Tool for assessing risks associated with mental-health problems. The complexity of risk data and the wide variations in clinicians' expert opinions make it difficult to elicit representations of uncertainty that are an accurate and meaningful consensus. It requires integrating each expert's estimation of a continuous distribution of uncertainty across a range of values. This paper describes an algorithm that generates a consensual distribution while measuring the consistency of the inputs. Hence it provides a measure of the confidence in the particular data item's risk contribution at the input stage, and can help give an indication of the quality of subsequent risk predictions. © 2010 IEEE.
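
As an illustration of the general idea (not the specific GRiST algorithm), the sketch below pools each expert's estimated distribution over the value range into a consensus by averaging, and scores input consistency from the mean Jensen-Shannon divergence of each expert's curve against that consensus; such a score could then temper confidence in the data item's risk contribution:

    import numpy as np

    def normalise(d):
        d = np.asarray(d, dtype=float)
        return d / d.sum()

    def consensus_and_consistency(expert_distributions):
        """Average the experts' distributions; score consistency in [0, 1] from the
        mean Jensen-Shannon divergence of each expert against the consensus."""
        experts = [normalise(d) for d in expert_distributions]
        consensus = normalise(np.mean(experts, axis=0))

        def kl(p, q):
            mask = p > 0
            return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

        def jsd(p, q):
            m = (p + q) / 2
            return 0.5 * kl(p, m) + 0.5 * kl(q, m)

        consistency = 1.0 - float(np.mean([jsd(e, consensus) for e in experts]))
        return consensus, consistency

    # Three hypothetical experts rating uncertainty across five value bins.
    experts = [[0.1, 0.2, 0.4, 0.2, 0.1],
               [0.0, 0.3, 0.4, 0.3, 0.0],
               [0.1, 0.1, 0.5, 0.2, 0.1]]
    consensus, consistency = consensus_and_consistency(experts)
    print(np.round(consensus, 3), round(consistency, 3))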

Relevance:

30.00%

Publisher:

Abstract:

Effects of CO2 concentration on the elemental composition of the coccolithophore Emiliania huxleyi were studied in phosphorus-limited, continuous cultures that were acclimated to experimental conditions for 30 d prior to the first sampling. We determined phytoplankton and bacterial cell numbers, nutrients, particulate components such as organic carbon (POC), inorganic carbon (PIC), nitrogen (PN), organic phosphorus (POP) and transparent exopolymer particles (TEP), as well as dissolved organic carbon (DOC) and nitrogen (DON), in addition to carbonate system parameters, at CO2 levels of 180, 380 and 750 µatm. No significant difference between treatments was observed for any of the measured variables during repeated sampling over a 14 d period. We considered several factors that might lead to these results, i.e. light, nutrients, carbon overconsumption and transient versus steady-state growth. We suggest that the absence of a clear CO2 effect during this study does not necessarily imply the absence of an effect in nature. Instead, the sensitivity of the cell towards environmental stressors such as CO2 may vary depending on whether growth conditions are transient or sufficiently stable to allow for optimal allocation of energy and resources. We tested this idea on previously published data sets in which cellular PIC and POC contents (PIC and POC divided by the corresponding cell abundance) of E. huxleyi were available at various pCO2 levels and growth rates.