85 results for Continuous stirred reactor


Relevance: 20.00%

Abstract:

Endocrinopathic laminitis is frequently associated with hyperinsulinaemia, but the role of glucose in the pathogenesis of the disease has not been fully investigated. This study aimed to determine the endogenous insulin response to a quantity of glucose equivalent to that administered during a laminitis-inducing, euglycaemic, hyperinsulinaemic clamp, over 48 h in insulin-sensitive Standardbred racehorses. In addition, the study investigated whether glucose infusion, in the absence of exogenous insulin administration, would result in the development of clinical and histopathological evidence of laminitis. Glucose (50% dextrose) was infused intravenously at a rate of 0.68 mL/kg/h for 48 h in treated horses (n = 4), and control horses (n = 3) received a balanced electrolyte solution (0.68 mL/kg/h). Lamellar histology was examined at the conclusion of the experiment. Horses in the treatment group were insulin sensitive (M value 0.039 ± 0.0012 mmol/kg/min and M-to-I ratio (100×) 0.014 ± 0.002), as determined by an approximated hyperglycaemic clamp. Treated horses developed glycosuria, hyperglycaemia (10.7 ± 0.78 mmol/L) and hyperinsulinaemia (208 ± 26.1 μIU/mL), whereas control horses did not. None of the horses became lame as a consequence of the experiment, but all of the treated horses developed histopathological evidence of laminitis in at least one foot. Combined with earlier studies, the results showed that laminitis may be induced by either insulin alone or a combination of insulin and glucose, but that it is unlikely to be due to a glucose overload mechanism. Based on the histopathological data, the potential threshold for insulin toxicity (i.e. laminitis) in horses may be at or below a serum concentration of ∼200 μIU/mL.

Relevance: 20.00%

Abstract:

Appearance-based localization can provide loop closure detection at vast scales regardless of accumulated metric error. However, the computation time and memory requirements of current appearance-based methods scale not only with the size of the environment but also with the operation time of the platform. Additionally, repeated visits to locations will develop multiple competing representations, which will reduce recall performance over time. These properties impose severe restrictions on long-term autonomy for mobile robots, as loop closure performance will inevitably degrade with increased operation time. In this paper, we present a graphical extension to CAT-SLAM, a particle filter-based algorithm for appearance-based localization and mapping, to provide constant computation and memory requirements over time and minimal degradation of recall performance during repeated visits to locations. We demonstrate loop closure detection in a large urban environment with capped computation time and memory requirements, with performance exceeding previous appearance-based methods by a factor of 2. We discuss the limitations of the algorithm with respect to environment size, appearance change over time, and applications in topological planning and navigation for long-term robot operation.
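
As a rough illustration of the particle-filter idea underlying this family of methods, the sketch below maintains a fixed-size particle set over a graph of previously visited places, which is what keeps per-update computation and memory constant. It is a generic sketch under assumed graph and likelihood models, not the authors' CAT-SLAM algorithm; all names and constants are illustrative.

```python
import numpy as np

# Generic particle-filter sketch of appearance-based localization over a
# graph of visited places. NOT the authors' CAT-SLAM implementation: the
# graph topology, the appearance likelihood and all constants are
# illustrative assumptions. The fixed particle count is what keeps
# per-update computation and memory constant over time.

rng = np.random.default_rng(0)
N_NODES = 200            # nodes in the place/trajectory graph (assumed)
N_PARTICLES = 500        # fixed -> constant per-step cost

def appearance_likelihood(nodes, observation):
    # Placeholder: a real system compares image descriptors; here each
    # node's "appearance" is simply its index.
    return np.exp(-0.5 * (observation - nodes) ** 2 / 4.0)

particles = rng.integers(0, N_NODES, N_PARTICLES)

def update(observation):
    """One step: motion along the graph, reweight, resample, hypothesise."""
    global particles
    # Motion: each particle stays put or moves to the next node (assumed).
    moves = rng.integers(0, 2, N_PARTICLES)
    particles = np.where(moves == 0, particles, (particles + 1) % N_NODES)
    # Measurement: reweight by appearance likelihood of the current image.
    w = appearance_likelihood(particles, observation)
    w /= w.sum()
    # Systematic resampling keeps the particle count (and memory) fixed.
    cs = np.cumsum(w); cs[-1] = 1.0
    pos = (rng.random() + np.arange(N_PARTICLES)) / N_PARTICLES
    particles = particles[np.searchsorted(cs, pos)]
    # Loop-closure hypothesis: the most heavily populated node.
    nodes, counts = np.unique(particles, return_counts=True)
    return nodes[counts.argmax()], counts.max() / N_PARTICLES

node, support = update(observation=42.0)
print(f"loop-closure hypothesis: node {node} (support {support:.2f})")
```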

Relevance: 20.00%

Abstract:

Background: When large-scale trials investigate the effects of interventions on appetite, it is paramount to efficiently monitor large amounts of human data. The original hand-held Electronic Appetite Ratings System (EARS) was designed to facilitate the administering and data management of visual analogue scales (VAS) of subjective appetite sensations. The purpose of this study was to validate a novel hand-held method (EARS II (HP® iPAQ)) against the standard pen-and-paper (P&P) method and the previously validated EARS. Methods: Twelve participants (5 male, 7 female, aged 18-40) were involved in a fully repeated measures design. Participants were randomly assigned, in a crossover design, to either high fat (>48% fat) or low fat (<28% fat) meal days, one week apart, and completed ratings using the three data capture methods ordered according to a Latin square. The first set of appetite sensations was completed in a fasted state, immediately before a fixed breakfast. Thereafter, appetite sensations were completed every thirty minutes for 4 h. An ad libitum lunch was provided immediately before completing a final set of appetite sensations. Results: Repeated measures ANOVAs were conducted for ratings of hunger, fullness and desire to eat. There were no significant differences between P&P and either EARS or EARS II (p > 0.05). Correlation coefficients between P&P and EARS II, controlling for age and gender, were computed on area under the curve (AUC) ratings. R² values for hunger (0.89), fullness (0.96) and desire to eat (0.95) were statistically significant (p < 0.05). Conclusions: EARS II was sensitive to the impact of a meal and the recovery of appetite during the postprandial period and is therefore an effective device for monitoring appetite sensations. This study provides evidence and support for further validation of the novel EARS II method for monitoring appetite sensations during large-scale studies. The system's added versatility means it could potentially also monitor a range of other behavioural and physiological measures often important in clinical and free-living trials.
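
Since the comparison between capture methods hinges on area-under-the-curve summaries of the timed VAS ratings, here is a minimal sketch of a trapezoidal AUC computation; the times and ratings are made-up illustrative values, not study data.

```python
import numpy as np

# Trapezoidal area under the curve (AUC) for timed VAS appetite ratings.
# Times and ratings are made-up illustrative values, not study data.
times_min = np.array([0, 30, 60, 90, 120, 150, 180, 210, 240])  # minutes
hunger_mm = np.array([72, 38, 41, 47, 55, 60, 66, 70, 74.0])    # 0-100 mm VAS

# Manual trapezoid rule (avoids NumPy's trapz/trapezoid renaming).
auc = float(np.sum((hunger_mm[1:] + hunger_mm[:-1]) / 2 * np.diff(times_min)))
print(f"Hunger AUC over {times_min[-1]} min: {auc:.0f} mm*min")
```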

Relevance: 20.00%

Abstract:

Here we present a sequential Monte Carlo approach to Bayesian sequential design for the incorporation of model uncertainty. The methodology is demonstrated through the development and implementation of two model discrimination utilities, mutual information and total separation, but it can also be applied more generally if one has different experimental aims. A sequential Monte Carlo algorithm is run for each rival model (in parallel) and provides a convenient estimate of the marginal likelihood of each model given the data, which can be used for model comparison and in the evaluation of utility functions. A major benefit of this approach is that it requires very little problem-specific tuning and is also computationally efficient when compared to full Markov chain Monte Carlo approaches. This research is motivated by applications in drug development and chemical engineering.
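
As a rough illustration of the evidence estimate described here, the sketch below runs a data-annealed sequential Monte Carlo pass for a single toy model (normal likelihood with a normal prior on its mean) and accumulates the log marginal likelihood from the mean incremental weights. The model, prior and data are assumptions for illustration only; in the setting of the paper, one such pass would be run per rival model, and the utilities (mutual information, total separation) are not implemented here.

```python
import numpy as np

# Minimal data-annealed SMC sketch: particles drawn from the prior are
# reweighted one observation at a time; the running sum of the log mean
# incremental weights estimates log p(y | M). Toy model, prior and data
# are illustrative assumptions, not those of the paper.

rng = np.random.default_rng(1)
y = rng.normal(0.5, 1.0, size=20)          # synthetic observations

N = 2000
theta = rng.normal(0.0, 2.0, size=N)       # prior: theta ~ N(0, 2^2)
log_evidence = 0.0

def loglik(theta, obs):                    # y_t | theta ~ N(theta, 1)
    return -0.5 * np.log(2 * np.pi) - 0.5 * (obs - theta) ** 2

for obs in y:
    logw = loglik(theta, obs)
    # Log of the mean incremental weight, computed stably.
    m = logw.max()
    log_evidence += m + np.log(np.mean(np.exp(logw - m)))
    # Resample proportionally to the incremental weights.
    w = np.exp(logw - m)
    w /= w.sum()
    theta = theta[rng.choice(N, size=N, p=w)]
    # (A full SMC sampler would also apply an MCMC move step here to
    #  rejuvenate particle diversity for the static parameter.)

print(f"log p(y | M) ~ {log_evidence:.2f}")
```

Running one such sampler per rival model yields the marginal likelihoods needed for model comparison, as the abstract describes.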

Relevance: 20.00%

Abstract:

The Australian Securities Exchange (ASX) listing rule 3.1 requires listed companies to immediately disclose price-sensitive information to the market via the ASX's Company Announcements Platform (CAP) prior to release through other disclosure channels. Since 1999, to improve the communication process, the ASX has permitted third-party mediation in the disclosure process that leads to the release of an Open Briefing (OB) through CAP. An OB is an interview between senior executives of the firm and an Open Briefing analyst employed by Orient Capital Pty Ltd (broaching topics such as current profit and outlook). Motivated by an absence of research on factors that influence firms to use OBs as a discretionary disclosure channel, this study examines: (1) Why do firms choose to release information to the market via OBs? (2) What firm characteristics explain the discretionary use of OBs as a disclosure channel? (3) What disclosure attributes influence firms' decisions to regularly use OBs as a disclosure channel? Based on agency and information economics theories, a theoretical framework is developed to address the research questions. This framework comprises disclosure environments (firm characteristics and external factors), disclosure attributes and disclosure consequences. To address the first research question, the study investigates (i) the purpose of using OBs, (ii) whether firms use OBs to provide information relating to previous public announcements, and (iii) whether firms use OBs to provide routine or non-routine disclosures. In relation to the second and third research questions, hypotheses are developed to test factors expected to explain the discretionary use of OBs and firms' decisions to regularly use OBs, and to explore the factors influencing the nature of OB disclosure. Content analysis and logistic regression models are used to investigate the research questions and test the hypotheses. Data are drawn from a hand-collected population of 1863 OB announcements issued by 239 listed firms between 2000 and 2010. The results show that the types of information disclosed via an OB announcement principally concern corporate strategies, performance and outlook. Most OB announcements are linked with a previous related announcement, with the lag between announcements significantly longer for loss-making firms than for profit-making firms. The main results show that firms which tend to be larger, have an analyst following, and have higher growth opportunities are more likely to release OBs. Further, older firms and firms that release OB announcements containing good news, historical information and less complex information tend to be regular OB users. Lastly, firms more likely to disclose strategic information via OBs tend to operate in industries facing greater uncertainty, lack an analyst following, and have higher growth opportunities; such firms are less likely to disclose good news, historical information and complex information via OBs. This study contributes to the disclosure literature in terms of the disclosure attributes and firm characteristics that influence behaviour in this unique (OB) disclosure channel. With regard to practical significance, regulators can gain an understanding of how OBs are disclosed, which can assist them in monitoring the use of OBs and improving the effectiveness of communications with stakeholders. In addition, investors can gain a better comprehension of the information contained in OB announcements, which may in turn better facilitate their investment decisions.
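
The study's second research question is tested with logistic regression of OB use on firm characteristics. The sketch below shows the general shape of such a model on synthetic data; the variable names, sample size and coefficients are hypothetical, and this is not the study's fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Schematic logistic regression relating firm characteristics to the
# decision to release Open Briefings (OBs). Variable names, data and
# coefficients are hypothetical assumptions for illustration.

rng = np.random.default_rng(2)
n = 500                                     # firm-year observations (assumed)
X = pd.DataFrame({
    "log_size": rng.normal(8.0, 1.5, n),    # firm size proxy (assumed)
    "analyst_following": rng.integers(0, 2, n),
    "growth": rng.normal(1.2, 0.4, n),      # growth opportunities proxy
})

# Synthetic outcome loosely reflecting the reported direction of effects:
# larger firms with an analyst following and higher growth use OBs more.
eta = -9.0 + 1.0 * X.log_size + 0.8 * X.analyst_following + 0.6 * X.growth
uses_ob = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(int)

fit = sm.Logit(uses_ob, sm.add_constant(X)).fit(disp=0)
print(fit.params)      # estimated log-odds effect of each characteristic
```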

Relevance: 20.00%

Abstract:

Most current computer systems authorise the user at the start of a session and do not detect whether the current user is still the initial authorised user, a substitute user, or an intruder pretending to be a valid user. Therefore, a system that continuously checks the identity of the user throughout the session, without being intrusive to the end-user, is necessary. Such a system is called a continuous authentication system (CAS). Researchers have applied several approaches to CAS, and most of these techniques are based on biometrics. These continuous biometric authentication systems (CBAS) are supplied by user traits and characteristics. One of the main types of biometrics is keystroke dynamics, which has been widely tried and accepted for providing continuous user authentication. Keystroke dynamics is appealing for many reasons. First, it is less obtrusive, since users will be typing on the computer keyboard anyway. Second, it does not require extra hardware. Finally, keystroke dynamics will be available after the authentication step at the start of the computer session. Currently, there is insufficient research in the field of CBAS with keystroke dynamics. To date, most of the existing schemes ignore the continuous authentication scenarios, which might affect their practicality in different real-world applications. Also, contemporary CBAS with keystroke dynamics approaches use character sequences as features representative of user typing behaviour, but their feature selection criteria do not guarantee features with strong statistical significance, which may produce a less accurate statistical user representation. Furthermore, their selected features do not inherently incorporate user typing behaviour. Finally, the existing CBAS based on keystroke dynamics are typically dependent on pre-defined user-typing models for continuous authentication. This dependency restricts the systems to authenticating only known users whose typing samples are modelled. This research addresses the previous limitations associated with the existing CBAS schemes by developing a generic model to better identify and understand the characteristics and requirements of each type of CBAS and continuous authentication scenario. The research also proposes four statistical feature selection techniques that select features with the highest statistical significance and encompass different user typing behaviours, so that user typing patterns are represented effectively. Finally, the research proposes a user-independent threshold approach that is able to authenticate a user accurately without needing any predefined user typing model a priori. The technique is further enhanced to detect an impostor or intruder who takes over at any point during the computer session.
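
For readers unfamiliar with keystroke dynamics, the raw material of such systems is timing features extracted from key events. The sketch below computes two common feature families, per-key hold times and down-down digraph latencies; the event format and values are assumptions for illustration, not the thesis's feature set.

```python
# Minimal keystroke-dynamics feature extraction: hold times per key and
# down-down digraph latencies. The event format (key, press_ms,
# release_ms) and the sample values are assumptions for illustration.
from collections import defaultdict

events = [                       # made-up sample of a user typing "the"
    ("t", 0,   95),
    ("h", 130, 210),
    ("e", 260, 340),
]

hold = defaultdict(list)         # key -> hold durations (ms)
digraph = defaultdict(list)      # (key1, key2) -> down-down latency (ms)

for (k, down, up) in events:
    hold[k].append(up - down)
for (k1, d1, _), (k2, d2, _) in zip(events, events[1:]):
    digraph[(k1, k2)].append(d2 - d1)

# A per-feature statistical profile (here just the mean) is what
# threshold-based verification schemes compare a fresh sample against.
profile = {f: sum(v) / len(v) for f, v in {**hold, **digraph}.items()}
print(profile)
```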

Relevance: 20.00%

Abstract:

Background: Insect baculovirus-produced Human immunodeficiency virus type 1 (HIV-1) Gag virus-like particles (VLPs) stimulate good humoral and cell-mediated immune responses in animals and are thought to be suitable as a vaccine candidate. Drawbacks to this production system include contamination of VLP preparations with baculovirus and the necessity for routine maintenance of infectious baculovirus stock. We used piggyBac transposition as a novel method to create transgenic insect cell lines for continuous VLP production as an alternative to the baculovirus system. Results: Transgenic cell lines maintained stable gag transgene integration and expression for up to 100 cell passages, and although the level of VLPs produced was low compared to baculovirus-produced VLPs, they appeared similar in size and morphology to baculovirus-expressed VLPs. In a murine immunogenicity study, whereas baculovirus-produced VLPs elicited good CD4 immune responses in mice when used to boost a prime with a DNA vaccine, no boost response was elicited by transgenically produced VLPs. Conclusion: Transgenic insect cells are stable and can produce HIV Pr55 Gag VLPs for over 100 passages; this novel result may simplify strategies aimed at making protein subunit vaccines for HIV. Immunogenicity of the Gag VLPs in mice was less than that of baculovirus-produced VLPs, which may be due to the lack of baculovirus glycoprotein incorporation in the transgenic cell VLPs. Improved yield and immunogenicity of transgenic cell-produced VLPs may be achieved with the addition of further genetic elements into the piggyBac integron.

Relevance: 20.00%

Abstract:

The residence time distribution (RTD) is a crucial parameter when treating engine exhaust emissions with a dielectric barrier discharge (DBD) reactor. In this paper, the residence time of such a reactor is investigated using finite-element-based software: COMSOL Multiphysics 4.3. Non-thermal plasma (NTP) discharge is being introduced as a promising method for pollutant emission reduction, and DBD is one of the most advantageous of the NTP technologies. In a two-cylinder co-axial DBD reactor, tubes are placed between two electrodes and flow passes through the annulus between these barrier tubes. If the mean residence time increases in a DBD reactor, there will be a corresponding increase in reaction time and, consequently, the pollutant removal efficiency can increase. However, pollutant formation can also occur during an increased mean residence time, so the proportion of fluid that may remain for periods significantly longer than the mean residence time is of great importance. In this study, the residence time distribution is first calculated for the standard reactor used by the authors for ultrafine particle (10-500 nm) removal. Then, different geometries and various inlet velocities are considered. Finally, for selected cases, roughness elements are added inside the reactor and the residence time is recalculated. These results will form the basis for a COMSOL plasma and CFD module investigation.
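
The quantities discussed here, the mean residence time and the long-tail fraction of fluid, follow from the standard RTD definitions E(t) = C(t) / ∫C dt and t_mean = ∫t E(t) dt. A minimal sketch of computing them from a tracer pulse response is below; the concentration curve is synthetic, standing in for CFD or experimental tracer data.

```python
import numpy as np

# Residence time distribution (RTD) moments from a tracer pulse response.
# E(t) = C(t) / int C dt ;  t_mean = int t E(t) dt.
# The concentration curve here is synthetic, for illustration only.

t = np.linspace(0, 2.0, 400)                     # time, s
C = t * np.exp(-t / 0.25)                        # assumed tracer response

def trapz(y, x):                                 # version-safe trapezoid rule
    return float(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))

E = C / trapz(C, t)                              # normalised RTD, E(t)
t_mean = trapz(t * E, t)                         # mean residence time
var = trapz((t - t_mean) ** 2 * E, t)            # spread of the RTD

# Fraction of fluid remaining much longer than the mean (> 2 * t_mean),
# the long-tail quantity the abstract highlights.
tail = trapz(np.where(t > 2 * t_mean, E, 0.0), t)

print(f"t_mean = {t_mean:.3f} s, variance = {var:.4f} s^2, "
      f"fraction beyond 2*t_mean = {tail:.3f}")
```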

Relevance: 20.00%

Abstract:

Wires of YBa2Cu3O7-x were fabricated by extrusion using a hydroxypropyl methylcellulose (HPMC) binder. As little as 2 wt.% binder was added to an oxide prepared by a novel co-precipitation process to produce a plastic mass which readily gave continuous extrusion of long lengths of wire in a reproducible fashion. Critical temperatures of 92 K were obtained for wires given optimum high-temperature heat treatments. Critical current densities greater than 1000 A cm-2 were measured at 77.3 K using heat treatments at around 910°C for 10 h. These transport critical current densities, measured on centimeter-long wires, were obtained with microstructures showing a relatively dense and uniform distribution of randomly oriented, small YBa2Cu3O7-x grains.

Relevance: 20.00%

Abstract:

Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled by average Internet users. The management of secure passwords, for example, creates extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches are applicable only for initial logins and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioural biometrics, where keystroke dynamics based on free text is used continuously for verifying the identity of a user in real time. We improve existing keystroke dynamics-based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behaviour into consideration in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set which we collected from users while they were interacting with their mailboxes during their daily activities.
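
To make the scalability idea concrete, the sketch below compares a fresh free-text sample against a small, constant-size set of reference profiles and accepts the claimed identity only if it is the nearest profile. The mean-absolute-difference distance and all profile values here are illustrative stand-ins, not the paper's distance measure or data.

```python
import numpy as np

# Schematic continuous-verification decision: compare the latest free-text
# sample against a constant-size set of reference profiles (not the whole
# user space) and accept if the claimed user is nearest. The distance and
# all values are illustrative assumptions, not the paper's measure.

def distance(sample, profile):
    """Mean absolute latency difference over digraphs present in both."""
    shared = sample.keys() & profile.keys()
    if not shared:
        return float("inf")
    return float(np.mean([abs(sample[d] - profile[d]) for d in shared]))

# Digraph -> mean down-down latency (ms); all values made up.
profiles = {
    "alice": {("t", "h"): 120, ("h", "e"): 105, ("a", "n"): 140},
    "bob":   {("t", "h"): 180, ("h", "e"): 160, ("a", "n"): 210},
    "carol": {("t", "h"): 95,  ("h", "e"): 130, ("a", "n"): 115},
}

def verify(claimed_user, sample, profiles):
    dists = {u: distance(sample, p) for u, p in profiles.items()}
    return min(dists, key=dists.get) == claimed_user

sample = {("t", "h"): 125, ("h", "e"): 110, ("a", "n"): 150}
print(verify("alice", sample, profiles))   # True: alice is nearest
```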

Relevance: 20.00%

Abstract:

There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement of expensive devices, the risk of stolen bio-templates, etc. Moreover, existing approaches usually perform the authentication process only once, at the start of a session. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process, extending the third category: iii-2) how someone behaves. In recent years, various keystroke-dynamics behaviour-based approaches have been published that are able to authenticate humans based on their typing behaviour. The majority focus on so-called static-text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free-text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions. Unsolved problems include, for instance, scalability, high response times and high error rates. The aim of this work is the development of behaviour-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.

Relevance: 20.00%

Abstract:

Since 1 December 2002, the New Zealand Exchange's (NZX) continuous disclosure listing rules have operated with statutory backing. To test the effectiveness of the new corporate disclosure regime, we compare the change in quantity of market announcements (overall, non-routine, non-procedural and external) released to the NZX before and after the introduction of statutory backing. We also extend our study by investigating whether the effectiveness of the new corporate disclosure regime is diminished or augmented by corporate governance mechanisms, including board size, providing separate roles for CEO and Chairman, board independence, board gender diversity and audit committee independence. Our findings provide qualified support for the effectiveness of the new corporate disclosure regime regarding the quantity of market disclosures. There is strong evidence that the effectiveness of the new corporate disclosure regime was augmented by providing separate roles for CEO and Chairman, board gender diversity and audit committee independence, and diminished by board size. In addition, there is significant evidence that share price queries do impact corporate disclosure behaviour and that this impact is significantly influenced by corporate governance mechanisms. Our findings provide important implications for corporate regulators in their quest for a superior disclosure regime.

Relevance: 20.00%

Abstract:

Nonthermal plasma (NTP) treatment of exhaust gas is a promising technology for the reduction of both nitrogen oxides (NOX) and particulate matter (PM) by introducing plasma into the exhaust gases. This paper considers the effect of NTP on PM mass reduction, PM size distribution, and PM removal efficiency. The experiments are performed on real exhaust gases from a diesel engine. The NTP is generated by applying high-voltage pulses, using a pulsed power supply, across a dielectric barrier discharge (DBD) reactor. The effects of applied high-voltage pulses up to 19.44 kVpp with a repetition rate of 10 kHz are investigated. It is shown that PM removal and PM size distribution need to be considered together, as it is possible to achieve high PM removal efficiency with an undesirable increase in the number of small particles. With regard to these two factors, a voltage level of 17 kVpp is determined to be the optimum point for the given configuration. Moreover, particle deposition on the surface of the DBD reactor is found to be a significant phenomenon which should be considered in all plasma PM removal tests.
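
The interplay the paper warns about, high overall removal masking a growth in small-particle counts, is easy to see in a size-resolved efficiency calculation. The sketch below uses made-up upstream/downstream number distributions; the bin edges and counts are assumptions, not measurement data from the paper.

```python
import numpy as np

# Size-resolved PM removal efficiency: eta(d) = 1 - N_out(d) / N_in(d).
# A negative eta in a small-size bin flags the undesirable increase in
# small-particle counts the paper warns about. All numbers are made up.

bins_nm = np.array([10, 50, 100, 250, 500])            # particle diameter
n_in = np.array([8.0e5, 6.0e5, 3.0e5, 1.0e5, 2.0e4])   # counts/cm^3, upstream
n_out = np.array([9.5e5, 2.5e5, 0.8e5, 0.2e5, 0.3e4])  # downstream

eta = 1.0 - n_out / n_in
for d, e in zip(bins_nm, eta):
    flag = "  <-- more small particles" if e < 0 else ""
    print(f"{d:4d} nm: removal efficiency {e:+.0%}{flag}")

# The overall number-based efficiency can still look acceptable:
print(f"overall: {1 - n_out.sum() / n_in.sum():+.0%}")
```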

Relevance: 20.00%

Abstract:

The Lake Wivenhoe Integrated Wireless Sensor Network is conceptually similar to traditional SCADA monitoring and control approaches. However, it is applied in an open system, using wireless devices to monitor processes that affect water quality at high spatial and temporal frequency. This monitoring assists scientists to better understand the drivers of key processes that influence water quality, and provides operators with an early warning system if below-standard water enters the reservoir. Both of these aspects improve the safety and efficiency of drinking water delivery to end users.