90 results for two-step chemical reaction model
Abstract:
Alloy nanoparticles (NPs) of gold and palladium on a ZrO2 support (Au–Pd@ZrO2) were found to be highly active in the oxidation of benzyl alcohols and can be used for the tandem synthesis of imines from benzyl alcohols and amines via a one-pot, two-step process under mild reaction conditions. The first step of the process is the oxidation of benzyl alcohol to benzaldehyde; excellent yields were achieved after 7 h of reaction at 40 °C without the addition of any base. In the second step, aniline was introduced into the reaction system to produce N-benzylideneaniline, and the benzaldehyde obtained in the first step was completely consumed within 1 h. A range of benzyl alcohols and amines was investigated to establish the general applicability of the Au–Pd alloy catalysts. The performance of the catalysts was found to depend on the Au–Pd metal content and composition; the optimal catalyst is 3.0 wt% Au–Pd@ZrO2 with an Au:Pd molar ratio of 1:1. The alloy NP catalyst exhibited catalytic properties superior to those of pure Au or Pd NPs because, according to density functional theory (DFT) simulations, the surface of alloy NPs has higher charge heterogeneity than that of pure metal NPs.
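A schematic of the one-pot, two-step sequence described above; the aerobic (O2) oxidation stoichiometry in the first line is an assumption, as the abstract does not name the oxidant:

```latex
% Step 1: oxidation of benzyl alcohol to benzaldehyde (assumed aerobic)
\mathrm{PhCH_2OH} + \tfrac{1}{2}\,\mathrm{O_2}
  \xrightarrow{\ \text{Au--Pd@ZrO}_2,\ 40\,^{\circ}\mathrm{C}\ }
  \mathrm{PhCHO} + \mathrm{H_2O}
% Step 2: condensation of benzaldehyde with aniline to the imine
\mathrm{PhCHO} + \mathrm{PhNH_2}
  \longrightarrow
  \mathrm{PhCH{=}NPh} + \mathrm{H_2O}
```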
Abstract:
Surveying bird species richness is one of the most intriguing ecological topics for evaluating environmental health. Here, bird species richness denotes the number of unique bird species in a particular area. Factors complicating the investigation of bird species richness include weather, observation bias and, most importantly, the prohibitive cost of conducting surveys at large spatiotemporal scales. Thanks to advances in recording techniques, these problems have been alleviated by deploying sensors for acoustic data collection. Although automated detection techniques have been introduced to identify various bird species, the innate complexity of bird vocalizations, the background noise present in the recordings and the escalating volumes of acoustic data make the determination of bird species richness a challenging task. In this paper we propose a two-step computer-assisted sampling approach for determining bird species richness in one day of acoustic data. First, a classification model is built on acoustic indices to filter out minutes that contain few bird species. Then the classified bird minutes are ordered by an acoustic index and redundant temporal minutes are removed from the ranked minute sequence. The experimental results show that our method is more efficient than previous methods at directing experts in the determination of bird species.
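A minimal sketch of such a two-step sampling pass, assuming scikit-learn, per-minute acoustic-index feature vectors and a random-forest filter; the feature layout, classifier choice and temporal-redundancy rule are illustrative assumptions, not the authors' implementation:

```python
# Two-step idea from the abstract: (1) filter out minutes predicted to
# contain few bird species, (2) rank the remaining minutes by an acoustic
# index and drop temporally redundant neighbours.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def select_minutes(train_indices, train_labels, day_indices, minute_ids, min_gap=5):
    """train_indices: acoustic-index features per training minute (2-D array);
    train_labels: 1 if the minute contains bird activity, else 0;
    day_indices: features for the day's 1440 minutes (2-D array);
    minute_ids: minute-of-day for each row of day_indices."""
    # Step 1: keep only minutes classified as containing bird activity.
    clf = RandomForestClassifier(n_estimators=200).fit(train_indices, train_labels)
    keep = clf.predict(day_indices) == 1
    candidates = [(minute_ids[i], day_indices[i, 0]) for i in np.where(keep)[0]]

    # Step 2: rank by one acoustic index (column 0 here, an assumption) and
    # discard minutes too close in time to an already selected minute.
    candidates.sort(key=lambda x: x[1], reverse=True)
    selected = []
    for minute, _score in candidates:
        if all(abs(minute - s) >= min_gap for s in selected):
            selected.append(minute)
    return selected  # minutes to hand to experts for species identification
```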
Abstract:
A flexible and simple Bayesian decision-theoretic design for dose-finding trials is proposed in this paper. In order to reduce the computational burden, we adopt a working model with conjugate priors, which is flexible enough to fit all monotonic dose-toxicity curves and produces analytic posterior distributions. We also discuss how to use a proper utility function to reflect the interest of the trial. Patients are allocated based not only on the utility function but also on the chosen dose selection rule. The most popular dose selection rule is the one-step-look-ahead (OSLA), which selects the best-so-far dose. A more complicated rule, such as the two-step-look-ahead, is theoretically more efficient than the OSLA only when the required distributional assumptions are met, which is, however, often not the case in practice. We carried out extensive simulation studies to evaluate these two dose selection rules and found that the OSLA was often more efficient than the two-step-look-ahead under the proposed Bayesian structure. Moreover, our simulation results show that the proposed Bayesian method's performance is superior to that of several popular Bayesian methods and that the negative impact of prior misspecification can be managed at the design stage.
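A minimal sketch of the OSLA "best-so-far" selection under a conjugate working model; the Beta-Binomial per-dose model, the target toxicity rate and the distance-to-target utility are illustrative assumptions rather than the paper's exact working model and utility:

```python
# One-step-look-ahead (OSLA) dose selection: pick the single dose that is
# currently best under the utility, given the accumulated data.
import numpy as np

def osla_select(n_tox, n_pat, prior_a=0.5, prior_b=0.5, target=0.3):
    """n_tox[d], n_pat[d]: toxicities and patients observed at dose d.
    Returns the 'best-so-far' dose: the one whose posterior mean toxicity
    is closest to the target rate (a simple distance-to-target utility)."""
    post_mean = (prior_a + np.asarray(n_tox)) / (prior_a + prior_b + np.asarray(n_pat))
    utility = -np.abs(post_mean - target)   # higher is better
    return int(np.argmax(utility))

# Example: 4 dose levels with some accumulated data
print(osla_select(n_tox=[0, 1, 2, 4], n_pat=[3, 6, 6, 5]))  # -> 2
```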
Abstract:
This work reports on the fabrication of a superhydrophobic nylon textile based on the organic charge transfer complex CuTCNAQ (TCNAQ = 11,11,12,12-tetracyanoanthraquinodimethane). The nylon fabric, metallized with copper, undergoes a spontaneous chemical reaction with TCNAQ dissolved in acetonitrile to form nanorods of CuTCNAQ that are intertwined over the entire surface of the fabric. This creates the micro- and nanoscale roughness required for the Cassie-Baxter state, thereby achieving a superhydrophobic/superoleophilic surface without the need for a fluorinated coating. The material is characterised by SEM, FT-IR spectroscopy and XPS, and investigated for its ability to separate oil and water in two modes, namely under gravity and as an absorbent. It is found that the fabric can separate dichloromethane, olive oil and crude oil from water and in fact reduces the water content of the oil during the separation process. The fabric is reusable and tolerant of conditions such as seawater, hydrochloric acid and extended periods on the shelf. The fact that CuTCNAQ is a copper-based semiconductor may also open up the possibility of other applications in areas such as photocatalysis and antibacterial treatment.
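A schematic of the spontaneous charge-transfer reaction implied above; the electron-transfer formulation (written by analogy with the well-known Cu/TCNQ chemistry) is an assumption, not stated in the abstract:

```latex
% Spontaneous formation of the charge transfer complex on the
% copper-metallized fabric in acetonitrile (schematic, assumed redox form)
\mathrm{Cu^{0}} + \mathrm{TCNAQ}
  \xrightarrow{\ \mathrm{CH_3CN}\ }
  \mathrm{Cu^{+}[TCNAQ]^{\bullet -}} \;(\equiv \mathrm{CuTCNAQ})
```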
Abstract:
The hype cycle model traces the evolution of technological innovations as they pass through successive stages marked by the peak, disappointment, and recovery of expectations. Since its introduction by Gartner nearly two decades ago, the model has received growing interest from practitioners and, more recently, from scholars. Given the model's proclaimed capacity to forecast technological development, an important consideration for organizations in formulating marketing strategies, this paper provides a critical review of the hype cycle model by seeking evidence in Gartner's own technology databases for the manifestation of hype cycles. The results of our empirical work show inconsistencies with Gartner's own reports, which motivates us to consider possible future directions whereby the notion of hype or hyped dynamics (though not necessarily the hype cycle model itself) can be captured in existing life cycle models through the identification of peak, disappointment, and recovery patterns.
Abstract:
This paper presents a prototype tracking system for tracking people in enclosed indoor environments where there is a high rate of occlusion. The system uses a stereo camera for acquisition and is capable of disambiguating occlusions using a combination of depth map analysis, a two-step ellipse-fitting people detection process, motion models and Kalman filters, and a novel fit metric based on computationally simple object statistics. Testing shows that our fit metric outperforms commonly used position-based and histogram-based metrics, resulting in more accurate tracking of people.
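For illustration, a minimal constant-velocity Kalman filter of the kind such trackers typically use as the motion model; the state layout and noise levels are assumptions, not the authors' implementation:

```python
# Constant-velocity Kalman filter for a person's 2-D image/floor position.
import numpy as np

dt = 1.0                                   # frame interval
F = np.array([[1, 0, dt, 0],               # state: [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],                # only position is measured
              [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)                       # process noise (assumed)
R = 1.0 * np.eye(2)                        # measurement noise (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle given a position measurement z = [x_obs, y_obs]."""
    x, P = F @ x, F @ P @ F.T + Q                      # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
    x = x + K @ (np.asarray(z, float) - H @ x)         # update
    P = (np.eye(4) - K @ H) @ P
    return x, P
```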
Abstract:
This is the final report of research project 2002-057-C: Enabling Team Collaboration with Pervasive and Mobile Computing. The research project was carried out by the Australian Cooperative Research Centre for Construction Innovation and has two streams that consider the use of pervasive computing technologies in two different contexts. The first context was the on-site deployment of mobile computing devices, whereas the second was the use and development of intelligent rooms based on sensed environments and new human-computer interfaces (HCI) for collaboration in the design office. The two streams present a model of team collaboration that relies on continuous communication with people and access to information to reduce information leakage. This report consists of five sections: (1) Introduction; (2) Research Project Background; (3) Project Implementation; (4) Case Studies and Outcomes; and (5) Conclusion and Recommendations. Section 1 presents a brief description of the research project, including the general research objectives and structure. Section 2 introduces the background of the research and detailed information regarding the project participants, objectives and significance, as well as the research methodology. All research activities, such as the literature review and case studies, are summarised in Project Implementation in Section 3. Section 4 then analyses the case studies and presents their outcomes. The conclusions and recommendations of the research project are summarised in Section 5. Other information supporting the content of the report, such as the research project schedule, is provided in the appendices. The purpose of this final project report is to provide industry partners with detailed information on the project activities and methodology, such as the implementation of pervasive computing technologies in real contexts. The report summarises the outcomes of the case studies and provides recommendations to industry partners on using new technologies to support better project collaboration.
Abstract:
To date, automatic recognition of semantic information, such as salient objects and mid-level concepts, from images remains a challenging task. Since real-world objects tend to exist in a context within their environment, computer vision researchers have increasingly incorporated contextual information to improve object recognition. In this paper, we present a method to build a visual contextual ontology from salient object descriptions for image annotation. The ontology includes not only partOf/kindOf relations but also spatial and co-occurrence relations. A two-step image annotation algorithm based on ontology relations and probabilistic inference is also proposed. Unlike most existing work, we specifically explore how to combine ontology representation, contextual knowledge and probabilistic inference. The experiments show that image annotation results are improved on the LabelMe dataset.
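A toy sketch of what a two-step, context-aware annotation pass of this kind can look like; the data structures, the co-occurrence table and the linear rescoring rule are illustrative assumptions, not the paper's algorithm:

```python
# Step 1: take initial per-region labels from a base classifier.
# Step 2: rescore each region's candidate labels using co-occurrence
# statistics with the labels of the other regions in the same image.
def annotate(region_probs, cooccurrence, alpha=0.5):
    """region_probs: list of dicts {label: probability} from the base classifier.
    cooccurrence: dict mapping (label_a, label_b) -> P(a present | b present).
    Returns one label per region after contextual rescoring."""
    # Step 1: initial labels = argmax of the base classifier.
    initial = [max(p, key=p.get) for p in region_probs]

    # Step 2: boost each candidate label by how well it co-occurs with the
    # initial labels of the other regions (simple probabilistic vote).
    final = []
    for i, probs in enumerate(region_probs):
        context = [lab for j, lab in enumerate(initial) if j != i]
        def score(label):
            ctx = sum(cooccurrence.get((label, c), 0.0) for c in context)
            ctx /= max(len(context), 1)
            return (1 - alpha) * probs[label] + alpha * ctx
        final.append(max(probs, key=score))
    return final
```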
Abstract:
Background: Blood for transfusion may become contaminated at any point between collection and transfusion and may result in bacteraemia (the presence of bacteria in the blood), severe illness or even death for the blood recipient. Donor arm skin is one potential source of blood contamination, so it is usual to cleanse the skin with an antiseptic before blood donation. One-step and two-step alcohol-based antiseptic regimens are both commonly advocated, but there is uncertainty as to which is most effective. Objectives: To assess the effects of cleansing the skin of blood donors with alcohol in a one-step compared with alcohol in a two-step procedure to prevent contamination of collected blood or bacteraemia in the recipient. Search strategy: We searched the Cochrane Wounds Group Specialised Register (10 March 2009); the Cochrane Central Register of Controlled Trials (CENTRAL), The Cochrane Library 2009, Issue 1; Ovid MEDLINE (1950 to February Week 4 2009); Ovid EMBASE (1980 to 2009 Week 9); and EBSCO CINAHL (1982 to February Week 4 2009). We also searched the reference lists of key papers. Selection criteria: All randomised controlled trials (RCTs) comparing alcohol-based donor skin cleansing in a one-step versus a two-step process that includes alcohol and any other antiseptic for pre-venepuncture skin cleansing were considered. Quasi-randomised trials were to have been considered in the absence of RCTs. Data collection and analysis: Two review authors independently assessed studies for inclusion. Main results: No studies (RCTs or quasi-RCTs) met the inclusion criteria. Authors' conclusions: We did not identify any eligible studies for inclusion in this review. It is therefore unclear whether a two-step, alcohol followed by antiseptic, skin cleansing process prior to blood donation confers any reduction in the risk of blood contamination or bacteraemia in blood recipients, or conversely whether a one-step process increases the risk above that associated with a two-step process.
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined based on the users' needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness. Term-based approaches are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches also have to deal with low-frequency pattern issues. The measures used by the data mining techniques (for example, "support" and "confidence") to learn the profile have turned out to be unsuitable for filtering; they can lead to a mismatch problem. This thesis uses rough set-based reasoning (term-based) and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the problem of information mismatch and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy; the most likely relevant documents are assigned higher scores by the ranking function. Because relatively few documents are left after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, the state-of-the-art term-based models, including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
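A highly simplified sketch of the two-stage pipeline; a fixed threshold stands in for the rough-set decision model and "patterns" are reduced to weighted termsets, so this illustrates the control flow only, not the thesis's method:

```python
# Stage 1: topic filtering drops documents scoring below a threshold
# against the user profile. Stage 2: precision-oriented re-ranking of the
# survivors by matched patterns (weighted termsets).
def two_stage_filter(docs, profile_terms, patterns, threshold=0.2):
    """docs: list of (doc_id, set_of_terms);
    profile_terms: dict term -> weight learned from the user profile;
    patterns: dict frozenset_of_terms -> weight from the pattern taxonomy."""
    # Stage 1: topic filtering - discard likely irrelevant documents.
    def topic_score(terms):
        return sum(profile_terms.get(t, 0.0) for t in terms)
    survivors = [(d, terms) for d, terms in docs if topic_score(terms) >= threshold]

    # Stage 2: rank the remaining documents by pattern matches.
    def pattern_score(terms):
        return sum(w for p, w in patterns.items() if p <= terms)
    return sorted(survivors, key=lambda x: pattern_score(x[1]), reverse=True)
```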
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection for messages. AE is potentially more efficient than the two-step process of providing confidentiality for a message by encrypting it and then, in a separate pass, providing integrity protection by generating a Message Authentication Code (MAC) tag. This paper presents results of the analysis of three AE stream ciphers submitted to the recently completed eSTREAM competition. We classify the ciphers based on the methods they use to provide authenticated encryption and discuss possible methods for mounting attacks on these ciphers.
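To illustrate the contrast, a brief sketch using the pyca/cryptography package: a one-pass AEAD primitive versus a two-step encrypt-then-MAC construction. AES-GCM and AES-CTR+HMAC are used only as familiar stand-ins; the ciphers analysed in the paper are eSTREAM stream ciphers, not these.

```python
import os
import hmac
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

msg = b"attack at dawn"

# One-pass authenticated encryption: a single primitive produces the
# ciphertext and the integrity tag together.
ae_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ct_and_tag = AESGCM(ae_key).encrypt(nonce, msg, None)

# Two-step approach: encrypt first, then MAC the ciphertext in a
# separate pass (encrypt-then-MAC).
enc_key, mac_key, iv = os.urandom(16), os.urandom(32), os.urandom(16)
encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(iv)).encryptor()
ct = encryptor.update(msg) + encryptor.finalize()
tag = hmac.new(mac_key, iv + ct, hashlib.sha256).digest()
```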
Abstract:
Background: There are innumerable diabetes studies that have investigated associations between risk factors, protective factors, and health outcomes; however, these individual predictors are part of a complex network of interacting forces. Moreover, there is little awareness of resilience or its importance in chronic disease in adulthood, especially diabetes. Thus, this is the first study to: (1) extensively investigate the relationships among a host of predictors and multiple adaptive outcomes; and (2) conceptualise a resilience model among people with diabetes. Methods: This cross-sectional study was divided into two research studies. Study One translated two diabetes-specific instruments (Problem Areas In Diabetes, PAID; Diabetes Coping Measure, DCM) into Chinese and examined their psychometric properties, for use in Study Two, in a convenience sample of 205 outpatients with type 2 diabetes. In Study Two, an integrated theoretical model was developed and evaluated using the structural equation modelling (SEM) technique. A self-administered questionnaire was completed by 345 people with type 2 diabetes from the endocrine outpatient departments of three hospitals in Taiwan. Results: Confirmatory factor analyses confirmed a one-factor structure of the PAID-C, similar to that of the original version of the PAID. Strong content validity of the PAID-C was demonstrated. The PAID-C was associated with HbA1c and diabetes self-care behaviours, confirming satisfactory criterion validity. There was a moderate relationship between the PAID-C and the Perceived Stress Scale, supporting satisfactory convergent validity. The PAID-C also demonstrated satisfactory stability and high internal consistency. A four-factor structure and strong content validity of the DCM-C were confirmed. Criterion validity was demonstrated in that the DCM-C was significantly associated with HbA1c and diabetes self-care behaviours. There was a statistical correlation between the DCM-C and the Revised Ways of Coping Checklist, suggesting satisfactory convergent validity. Test-retest reliability demonstrated satisfactory stability of the DCM-C, and the total scale of the DCM-C showed adequate internal consistency. Age, duration of diabetes, diabetes symptoms, diabetes distress, physical activity, coping strategies, and social support were the most consistent factors associated with adaptive outcomes in adults with diabetes. Resilience was positively associated with coping strategies, social support, health-related quality of life, and diabetes self-care behaviours. Results of the structural equation modelling revealed that protective factors had a significant direct effect on adaptive outcomes; however, the construct of risk factors was not significantly related to adaptive outcomes. Moreover, resilience moderated the relationships between protective factors and adaptive outcomes, but there were no interaction effects of risk factors and resilience on adaptive outcomes. Conclusion: This study contributes to an understanding of how risk factors and protective factors work together to influence adaptive outcomes in blood sugar control, health-related quality of life, and diabetes self-care behaviours. Additionally, resilience is a positive personality characteristic that may play an important role in the adjustment process among people living with type 2 diabetes.
Abstract:
In this paper, we present a ∑GIi/D/1/∞ queue with heterogeneous input/output slot times. This queueing model can be regarded as an extension of the ordinary GI/D/1/∞ model. For this ∑GIi/D/1/∞ queue, we assume that several input streams arrive at the system according to different slot times; in other words, there are different slot times for the different input/output processes in the queueing model. The queueing model can therefore be used for an ATM multiplexer with heterogeneous input/output link capacities. Several cases of the queueing model are discussed to reflect different relationships among the input/output link capacities of an ATM multiplexer. In the queueing analysis, two approaches, the Markov model and the probability generating function technique, are adopted to derive the queue length distributions observed at different epochs. This model is particularly useful in the performance analysis of ATM multiplexers with heterogeneous input/output link capacities.
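A toy discrete-time simulation of the modelled setting: several input streams whose arrivals occur on their own slot times feed a single deterministic server with its own output slot time. The Bernoulli-per-slot arrivals and the specific slot times are assumptions made purely for illustration, not part of the paper's analysis:

```python
# Heterogeneous-slot-time queue: stream i may deposit one cell at multiples
# of its input slot time; the server removes one cell at multiples of the
# output slot time.
import random

def simulate(input_slots=(2, 3), p_arrival=(0.6, 0.5), output_slot=1,
             horizon=10_000, seed=0):
    """Returns the time-averaged queue length (in cells)."""
    random.seed(seed)
    queue, area = 0, 0
    for t in range(1, horizon + 1):
        # arrivals on each stream's own slot boundary
        for slot, p in zip(input_slots, p_arrival):
            if t % slot == 0 and random.random() < p:
                queue += 1
        # deterministic service on the output slot boundary
        if t % output_slot == 0 and queue > 0:
            queue -= 1
        area += queue
    return area / horizon

print(simulate())
```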
Abstract:
Purpose: To date, there have been no measuring techniques available that could clearly identify all phases of tear film surface kinetics in one interblink interval. Methods: Using a series of cases, we show that lateral shearing interferometry, equipped with a set of robust parameter estimation techniques, is able to characterize up to five different phases of tear film surface kinetics: (i) an initial fast tear film build-up phase, (ii) a further, slower tear film build-up phase, (iii) tear film stability, (iv) tear film thinning, and (v) after a detected break-up, subsequent tear film deterioration. Results: Several representative examples are given of estimating tear film surface kinetics in measurements in which the subjects were asked to blink and then keep their eyes open as long as they could. Conclusions: Lateral shearing interferometry is a noninvasive technique that provides a means for the temporal characterization of tear film surface kinetics and the opportunity to analyse the two-step tear film build-up process.