867 results for Expected satiety


Relevance:

10.00%

Publisher:

Abstract:

In this study, atmospheric-pressure plasmas were applied to modify the surface of silane-coated silica nanoparticles. Subsequently, nanocomposites were synthesized by incorporating the plasma-treated nanoparticles into an epoxy resin matrix. Electrical testing showed that these novel dielectric materials exhibited high partial discharge resistance, high dielectric breakdown strength, and enhanced endurance under high electric field stress. Through spectroscopic and microscopic analysis, we found that surface groups of the nanoparticles were activated and radicals were created by the plasma treatment. Moreover, a uniform dispersion of nanoparticles in the nanocomposites was observed. The improved dielectric performance of the nanocomposites is expected to arise from stronger chemical bonds formed between the surface groups of the plasma-treated nanoparticles and molecules in the matrix. This simple, effective and environmentally friendly approach aims to synthesize the next generation of high-performance nanocomposite dielectric insulation materials for applications in high-voltage power systems.

Relevance:

10.00%

Publisher:

Abstract:

In 2009, the Capital Markets Development Authority (CMDA) - Fiji’s capital market regulator - introduced the Code of Corporate Governance (the Code). The Code is ‘principle-based’ and requires companies listed on the South Pacific Stock Exchange (SPSE) and financial intermediaries to disclose their compliance with the Code’s principles. While compliance with the Code is mandatory, the nature and extent of disclosure is at the discretion of the complying entities. Agency theory and signalling theory suggest that firms with higher expected levels of agency costs will provide greater levels of voluntary disclosure as signals of strong corporate governance. Thus, the study seeks to test these theories by examining the heterogeneity of corporate governance disclosures by firms listed on the SPSE and by determining the characteristics of firms that provide similar levels of disclosure. We conducted a content analysis of corporate governance disclosures in the annual reports of firms from 2008 to 2012. The study finds that large, non-family-owned firms with high levels of shareholder dispersion provide a greater quantity and higher quality of corporate governance disclosures. For firms that are relatively smaller, family owned and have low levels of shareholder dispersion, the quantity and quality of corporate governance disclosures are much lower. Some of these firms provide boilerplate disclosures with minimal changes in the following years. These findings support the propositions of agency and signalling theory, which suggest that firms with a higher separation between agents and principals will provide more voluntary disclosures to reduce expected agency costs. Semi-structured interviews conducted with key stakeholders further reinforce the findings. The interviews also reveal that complying entities positively perceive the introduction of the Code. Furthermore, while compliance with the Code brought about additional costs, they believed that most of these costs were minimal and one-off, and that the benefits of greater corporate disclosure in improving user decision making outweighed the costs. The study contributes to the literature as it provides insight into the experience of a small capital market in introducing a ‘principle-based’ Code that attempts to encourage corporate governance practices through enhanced disclosure. The study also assists policy makers in better understanding complying entities’ motivations for compliance and the extent of compliance.

Relevance:

10.00%

Publisher:

Abstract:

The literature around Library 2.0 remains largely theoretical, with few empirical studies, and is particularly limited in developing countries such as Indonesia. This study addresses this gap and aims to provide information about the current state of knowledge on Indonesian LIS professionals’ understanding of Library 2.0. The researchers used qualitative and quantitative approaches for this study, asking thirteen closed- and open-ended questions in an online survey. The researchers used descriptive and in vivo coding to analyze the responses. Through their analysis, they identified three themes: technology, interactivity, and awareness of Library 2.0. Respondents demonstrated awareness of Library 2.0 and a basic understanding of the roles of interactivity and technology in libraries. However, overreliance on the technology used in libraries to conceptualize Library 2.0, without an emphasis on its core characteristics and principles, could lead to the misalignment of limited resources. The study results will potentially strengthen the research base for Library 2.0 practice, as well as inform the LIS curriculum in Indonesia so as to develop practitioners who are able to adapt to users’ changing needs and expectations. It is expected that the preliminary data from this study can be used to design a much larger and more complex future research project in this area.

Relevance:

10.00%

Publisher:

Abstract:

Plasma nanoscience is an emerging multidisciplinary research field at the cutting edge of a large number of disciplines, including but not limited to the physics and chemistry of plasmas and gas discharges, materials science, surface science, nanoscience and nanotechnology, solid-state physics, space physics and astrophysics, photonics, optics, plasmonics, spintronics, quantum information, physical chemistry, biomedical sciences and related engineering subjects. This paper examines the origin, progress and future perspectives of this research field, driven by the global scientific and societal challenges. The future potential of plasma nanoscience to remain a highly topical area on the global research and technological agenda in the age of fundamental-level control for a sustainable future is assessed using the framework of the five Grand Challenges for Basic Energy Sciences recently mapped by the US Department of Energy. It is concluded that the ongoing research is highly relevant and is expected to expand substantially, contributing competitively to the solution of all of these Grand Challenges. The approach to controlling energy and matter at nano- and subnanoscales is based on identifying the prevailing carriers and transfer mechanisms of energy and matter at the spatial and temporal scales most relevant to any particular nanofabrication process. Strong emphasis is placed on the competitive edge of plasma-based nanotechnology in applications related to the major socio-economic issues (energy, food, water, health and environment) that are crucial for the sustainable development of humankind. Several important emerging topics, opportunities and multidisciplinary synergies for plasma nanoscience are highlighted. The main nanosafety issues are also discussed, and the environment- and human-health-friendly features of plasma-based nanotechnology are emphasized.

Relevance:

10.00%

Publisher:

Abstract:

This review paper presents historical perspectives, recent advances and future directions in the multidisciplinary research field of plasma nanoscience. The current status and future challenges are presented using a three-dimensional framework. The first and largest dimension covers the most important classes of nanoscale objects (nanostructures, nanofeatures and nanoassemblies/nanoarchitectures) and materials systems, namely carbon nanotubes, nanofibres, graphene, graphene nanoribbons, graphene nanoflakes, nanodiamond and related carbon-based nanostructures; metal, silicon and other inorganic nanoparticles and nanostructures; soft organic nanomaterials; nano-biomaterials; biological objects and nanoscale plasma etching. In the second dimension, we discuss the most common types of plasmas and plasma reactors used in nanoscale plasma synthesis and processing. These include low-temperature non-equilibrium plasmas at low and high pressures, thermal plasmas, high-pressure microplasmas, plasmas in liquids and plasma–liquid interactions, high-energy-density plasmas, and ionized physical vapour deposition, as well as some other plasma-enhanced nanofabrication techniques. In the third dimension, we outline some of the 'Grand Science Challenges' and 'Grand Socio-economic Challenges' to which significant contributions from plasma nanoscience-related research can be expected in the near future. The urgent need for a stronger focus on practical, outcome-oriented research to tackle the grand challenges is emphasized and concisely formulated as 'from controlled complexity to practical simplicity' in solving grand challenges.

Relevance:

10.00%

Publisher:

Abstract:

Robert evaporators in Australian sugar factories are traditionally constructed with 44.45 mm outside diameter stainless steel tubes of ~2 m length for all stages of evaporation. There are a few vessels with longer tubes (up to 2.8 m) and with smaller and larger diameters (38.1 and 50.8 mm). Queensland University of Technology is undertaking a study to investigate the heat transfer performance of tubes of different lengths and diameters across the whole range of process conditions typically encountered in the evaporator set. Incorporating these results into practical evaporator designs requires an understanding of the cost implications of constructing evaporator vessels with calandrias having tubes of different dimensions. Cost savings are expected for tubes of smaller diameter and longer length in terms of material, labour and installation costs in the factory. However, these savings must be weighed against the heat transfer area required for the evaporation duty, which will likely be a function of the tube dimensions. In this paper a capital cost model is described which provides the relative cost of constructing and installing Robert evaporators of the same heating surface area but with different tube dimensions. Evaporators of 2000, 3000, 4000 and 5000 m² are investigated. This model will be used in conjunction with the heat transfer efficiency data (when available) to determine the optimum tube dimensions for a new evaporator at a specified evaporation duty. Consideration is also given to other factors such as juice residence time (with implications for sucrose degradation and control) and droplet de-entrainment in evaporators of different tube dimensions.
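To make the tube-dimension trade-off concrete, the sketch below is a minimal illustration of how such a capital cost comparison can be set up: for a fixed heating surface area, smaller-diameter or longer tubes change the tube count and hence the material and tube-plate work. The cost coefficients and the outside-area convention are hypothetical placeholders, not parameters of the model described in the paper.

```python
import math

def tube_count(area_m2: float, od_mm: float, length_m: float) -> int:
    """Number of tubes needed for a target heating surface area.

    Heating surface is taken as the outside cylindrical area of the
    tubes, A = pi * d * L * n (a simplifying assumption).
    """
    per_tube = math.pi * (od_mm / 1000.0) * length_m  # m^2 per tube
    return math.ceil(area_m2 / per_tube)

def relative_material_cost(area_m2, od_mm, length_m,
                           cost_per_tube_metre=1.0, cost_per_tube_end=0.5):
    """Hypothetical relative cost: tube material scales with total tube
    length; a fixed per-tube cost stands in for drilling and expanding
    each tube end into the tube plates."""
    n = tube_count(area_m2, od_mm, length_m)
    return n * (length_m * cost_per_tube_metre + 2 * cost_per_tube_end)

# Compare a conventional calandria with a longer, smaller-diameter one
# for the same 4000 m^2 heating surface.
for od, L in [(44.45, 2.0), (38.1, 2.8)]:
    n = tube_count(4000, od, L)
    c = relative_material_cost(4000, od, L)
    print(f"OD {od} mm x {L} m: {n} tubes, relative cost {c:.0f}")
```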

Relevance:

10.00%

Publisher:

Abstract:

The routine cultivation of human corneal endothelial cells, with a view to treating patients with endothelial dysfunction, remains a challenging task. While progress in this field has been buoyed by the proposed existence of progenitor cells for the corneal endothelium at the corneal limbus, strategies for exploiting this concept remain unclear. In the course of evaluating methods for growing corneal endothelial cells, we noted a case in which remarkable growth was achieved using a serial explant culture technique. Over the course of 7 months, a single explant of corneal endothelium, acquired from cadaveric human tissue, was sequentially seeded into 7 culture plates and on each occasion produced a confluent cell monolayer. Sample cultures were confirmed as endothelial in origin by positive staining for glypican-4. On each occasion, small cells closest to the tissue explant developed into a highly compact layer with an almost homogeneous structure. This layer was resistant to removal with trypsin and produced continuous cell outgrowth during multiple culture periods. The small cells gave rise to larger cells with phase-bright cell boundaries and prominent immunostaining for both nestin and telomerase. Nestin and telomerase were also strongly expressed in small cells immediately adjacent to the wound site following transfer of the explant to another culture plate. These findings are consistent with the theory that progenitor cells for the corneal endothelium reside within the limbus, and they provide new insights into expected expression patterns for nestin and telomerase within the differentiation pathway.

Relevance:

10.00%

Publisher:

Abstract:

Objective: To evaluate the potential impact of the current global economic crisis (GEC) on the spread of HIV.
Design: To evaluate the impact of the economic downturn, we studied two distinct HIV epidemics in Southeast Asia: the generalized epidemic in Cambodia, where incidence is declining, and the epidemic in Papua New Guinea (PNG), which is in an expansion phase.
Methods: Major HIV-related risk factors that may change due to the GEC were identified, and a dynamic mathematical transmission model was developed and used to forecast HIV prevalence, diagnoses, and incidence in Cambodia and PNG over the next 3 years.
Results: In Cambodia, the total number of HIV diagnoses is not expected to be greatly affected. However, an estimated increase of up to 10% in incident cases of HIV, due to potential changes in behavior, may not be observed by the surveillance system. In PNG, HIV incidence and diagnoses could be more affected by the GEC, resulting in respective increases of up to 17% and 11% over the next 3 years. Decreases in VCT and education programs are the factors of greatest concern in both settings. A reduction in the rollout of antiretroviral therapy could increase the number of AIDS-related deaths (by up to 7.5% after 3 years).
Conclusions: The GEC is likely to have a modest impact on HIV epidemics. However, there are plausible conditions under which economic downturns can noticeably influence epidemic trends. This study highlights the importance of maintaining funding for HIV programs.
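The abstract does not specify the structure of the transmission model. As a hedged illustration of the general approach, the sketch below integrates a minimal susceptible-infected (SI) compartment model in which a behavioural risk multiplier, standing in for GEC-driven changes, scales the transmission rate over a 3-year horizon. All parameter values are hypothetical.

```python
from scipy.integrate import solve_ivp

def si_model(t, y, beta, gamma, mu, risk_multiplier):
    """Minimal SI-type HIV transmission model (illustrative only).

    S: susceptible, I: infected (people living with HIV).
    beta is scaled by a risk multiplier representing behaviour
    changes attributed to the economic downturn.
    """
    S, I = y
    N = S + I
    infections = risk_multiplier * beta * S * I / N
    dS = mu * N - infections - mu * S
    dI = infections - (mu + gamma) * I
    return [dS, dI]

# Hypothetical parameters: transmission rate, AIDS-related mortality,
# background demography, and a 10% increase in risk behaviour.
beta, gamma, mu = 0.35, 0.08, 0.02
for risk in (1.0, 1.1):
    sol = solve_ivp(si_model, (0, 3), [0.99e6, 0.01e6],
                    args=(beta, gamma, mu, risk))
    S, I = sol.y[:, -1]
    print(f"risk x{risk}: prevalence after 3 years = {I / (S + I):.3%}")
```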

Relevance:

10.00%

Publisher:

Abstract:

Radial profiles of magnetic fields in the electrostatic (E) and electromagnetic (H) modes of low-frequency (∼500) inductively coupled plasmas (ICPs) were measured using miniature magnetic probes. A simplified plasma fluid model explaining the generation of the second harmonics of the azimuthal magnetic field in the plasma source was proposed. Because the nonlinear terms arising from the ponderomotive force are derived by a closely similar procedure, pronounced generation of a nonlinear static azimuthal magnetic field can also be expected.
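The step behind the coupled second-harmonic and static field generation can be made explicit with a one-line identity: quadratic nonlinear terms, such as ponderomotive-type contributions built from products of first-harmonic quantities, necessarily split into a DC part and a 2ω part. A minimal sketch, using generic first-harmonic factors rather than the specific terms of the fluid model:

```latex
% Product of two quantities oscillating at the drive frequency omega,
% e.g. v and B entering a quadratic (ponderomotive-type) term:
\cos(\omega t)\,\cos(\omega t + \phi)
  = \underbrace{\tfrac{1}{2}\cos\phi}_{\text{static component}}
  + \underbrace{\tfrac{1}{2}\cos(2\omega t + \phi)}_{\text{second harmonic}}
```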

Relevance:

10.00%

Publisher:

Abstract:

For the renewable energy sources whose outputs vary continuously, a Z-source current-type inverter has been proposed as a possible buck-boost alternative for grid-interfacing. With a unique X-shaped LC network connected between its dc power source and inverter topology, Z-source current-type inverter is however expected to suffer from compounded resonant complications in addition to those associated with its second-order output filter. To improve its damping performance, this paper proposes the careful integration of Posicast or three-step compensators before the inverter pulse-width modulator for damping triggered resonant oscillations. In total, two compensators are needed for wave-shaping the inverter boost factor and modulation ratio, and they can conveniently be implemented using first-in first-out stacks and embedded timers of modern digital signal processors widely used in motion control applications. Both techniques are found to damp resonance of ac filter well, but for cases of transiting from current-buck to boost state, three-step technique is less effective due to the sudden intermediate discharging interval introduced by its non-monotonic stepping (unlike the monotonic stepping of Posicast damping). These findings have been confirmed both in simulations and experiments using an implemented laboratory prototype.
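The abstract does not reproduce the compensator design; as a hedged sketch of classical half-cycle Posicast input shaping (the underlying idea, not necessarily the authors' exact implementation), the following splits a step command into two parts separated by half the damped resonant period, sized from the resonance's normalized overshoot.

```python
import math

def posicast_steps(zeta: float, omega_n: float):
    """Classical half-cycle Posicast shaper for a second-order
    resonance with damping ratio zeta and natural frequency omega_n.

    Returns (first_step_fraction, second_step_fraction, delay_s):
    apply the first fraction of the command at t = 0 and the
    remainder half a damped period later, cancelling the
    oscillatory part of the step response.
    """
    overshoot = math.exp(-zeta * math.pi / math.sqrt(1 - zeta**2))
    omega_d = omega_n * math.sqrt(1 - zeta**2)   # damped frequency
    delay = math.pi / omega_d                    # half damped period
    first = 1.0 / (1.0 + overshoot)
    return first, 1.0 - first, delay

# Hypothetical LC-filter resonance: lightly damped at 1 kHz.
first, second, delay = posicast_steps(zeta=0.05, omega_n=2 * math.pi * 1e3)
print(f"step 1: {first:.3f}, step 2: {second:.3f}, "
      f"{delay * 1e6:.0f} us apart")
```

Both fractions are positive, so the shaped command steps monotonically toward its target, which is the property the abstract contrasts with the three-step compensator's non-monotonic stepping.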

Relevance:

10.00%

Publisher:

Abstract:

Multi-party key agreement protocols implicitly assume that each principal contributes equally to the final form of the key. In this paper we consider three malleability attacks on multi-party key agreement protocols. The first attack, called strong key control, allows a dishonest principal (or a group of principals) to fix the key to a pre-set value. The second attack is weak key control, in which the key is still random but the set from which the key is drawn is much smaller than expected. The third attack is named selective key control, in which a dishonest principal (or a group of dishonest principals) is able to remove the contribution of honest principals to the group key. The paper discusses these three attacks on several key agreement protocols, including DH (Diffie-Hellman), BD (Burmester-Desmedt) and JV (Just-Vaudenay). We show that dishonest principals in all three protocols can weakly control the key, and that the only protocol which does not allow strong key control is the DH protocol. The BD and JV protocols permit any pair of neighboring principals to modify the group key, and this modification remains undetected by honest principals.
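As an illustration of the weak-key-control idea (a hedged toy example, not the attack construction from the paper), the sketch below shows how the last contributor in a naive group Diffie-Hellman-style exchange can grind its private value until the agreed key falls in a chosen small subset, here keys whose low byte is zero.

```python
# Toy demonstration of weak key control: the last (dishonest)
# contributor retries private values until the group key lands in a
# small target set. Parameters are tiny and insecure on purpose.
import hashlib
import random

p = 0xFFFFFFFB  # small prime (2^32 - 5), demo only
g = 5

def group_key(contributions):
    """Naive 'group key': hash of the combined public value
    g^(x1*x2*...*xn) mod p. A stand-in, not a real protocol."""
    exp = 1
    for x in contributions:
        exp = (exp * x) % (p - 1)
    shared = pow(g, exp, p)
    return hashlib.sha256(str(shared).encode()).digest()

honest = [random.randrange(2, p - 1) for _ in range(3)]

# The dishonest last principal grinds until the key's low byte is 0,
# shrinking the effective key space by a factor of 256.
tries = 0
while True:
    tries += 1
    x_evil = random.randrange(2, p - 1)
    key = group_key(honest + [x_evil])
    if key[-1] == 0:
        break
print(f"biased key found after {tries} tries: {key.hex()[:16]}...")
```

Grinding shrinks the set the key is drawn from without making the key non-random, which is exactly what distinguishes weak from strong key control.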

Relevance:

10.00%

Publisher:

Abstract:

The thesis investigates “where were the auditors in asset securitizations”, a criticism levelled at the audit profession before and after the onset of the global financial crisis (GFC). Asset securitizations increase audit complexity and audit risks, which are expected to increase audit fees. Using US bank holding company data from 2003 to 2009, this study examines the association between asset securitization risks and audit fees, and how that association changed during the global financial crisis. The main test is based on an ordinary least squares (OLS) model adapted from the Fields et al. (2004) bank audit fee model. I employ a principal components analysis to address high correlations among asset securitization risks. Individual securitization risks are also tested separately. A suite of sensitivity tests indicates the results are robust. These include model alterations, sample variations, further controls in the tests, and correcting for the securitizer self-selection problem. A partial least squares (PLS) path modelling methodology is introduced as a separate test, which allows for high intercorrelations, self-selection correction, and sequential-order hypotheses in one simultaneous model. The PLS results are consistent with the main results. The study finds significant positive associations between securitization risks and audit fees. After the commencement of the global financial crisis in 2007, there was an increased focus, prompted by bank failures, on the role of audits in relation to asset securitization risks; I therefore expect that auditors became more sensitive to bank asset securitization risks after the commencement of the crisis. I find that auditors appear to focus on different aspects of asset securitization risks during the crisis and appear to charge banks a GFC premium. Overall, the results support the view that auditors consider asset securitization risks and market changes, and adjust their audit effort and risk considerations accordingly.
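The exact variable set is specific to the thesis; as a hedged sketch of the general approach (principal components to collapse highly correlated securitization-risk proxies, then an OLS audit-fee regression), using simulated data and entirely hypothetical column names:

```python
# Sketch: PCA over correlated securitization-risk proxies, then OLS
# on log audit fees. Column names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "sec_volume": rng.gamma(2, 1, n),        # securitization proxies
    "retained_interest": rng.gamma(2, 1, n),
    "servicing_assets": rng.gamma(2, 1, n),
    "log_assets": rng.normal(15, 1, n),      # standard fee controls
    "loss_ratio": rng.normal(0.02, 0.01, n),
})
df["log_fee"] = 10 + 0.5 * df["log_assets"] + 0.3 * df["sec_volume"] \
                + rng.normal(0, 0.5, n)

# Collapse the correlated risk proxies into one principal component.
risk_cols = ["sec_volume", "retained_interest", "servicing_assets"]
scores = PCA(n_components=1).fit_transform(
    StandardScaler().fit_transform(df[risk_cols]))
df["sec_risk_pc1"] = scores.ravel()

# Audit-fee regression: fee on risk component plus controls.
X = sm.add_constant(df[["sec_risk_pc1", "log_assets", "loss_ratio"]])
print(sm.OLS(df["log_fee"], X).fit().summary().tables[1])
```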

Relevance:

10.00%

Publisher:

Abstract:

This paper provides a preliminary analysis of an autonomous uncooperative collision avoidance strategy for unmanned aircraft using image-based visual control. Assuming target detection, the approach consists of three parts. First, a novel decision strategy is used to determine appropriate reference image features to track for safe avoidance. This is achieved by considering the current rules of the air (regulations), the properties of spiral motion and the expected visual tracking errors. Second, a spherical visual predictive control (VPC) scheme is used to guide the aircraft along a safe spiral-like trajectory about the object. Lastly, a stopping decision based on thresholding a cost function is used to determine when to stop the avoidance behaviour. The approach does not require estimation of range or time to collision, and instead relies on tuning two mutually exclusive decision thresholds to ensure satisfactory performance.
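The paper's cost function is not reproduced here; as a hedged sketch of one plausible reading of the dual-threshold decision logic (two mutually exclusive thresholds, so the engage/stop decision cannot chatter when the cost hovers near a single boundary), with a placeholder cost signal:

```python
from enum import Enum

class Mode(Enum):
    NAVIGATE = 0
    AVOID = 1

class AvoidanceDecision:
    """Dual-threshold switching between navigation and avoidance.
    start_thresh and stop_thresh are mutually exclusive
    (stop_thresh < start_thresh), so small fluctuations around
    either threshold cannot cause rapid mode chatter."""

    def __init__(self, start_thresh: float, stop_thresh: float):
        assert stop_thresh < start_thresh
        self.start, self.stop = start_thresh, stop_thresh
        self.mode = Mode.NAVIGATE

    def update(self, cost: float) -> Mode:
        if self.mode is Mode.NAVIGATE and cost > self.start:
            self.mode = Mode.AVOID      # engage spiral avoidance
        elif self.mode is Mode.AVOID and cost < self.stop:
            self.mode = Mode.NAVIGATE   # resume normal guidance
        return self.mode

# Placeholder cost values, e.g. an image-feature tracking error.
decider = AvoidanceDecision(start_thresh=0.8, stop_thresh=0.2)
for cost in [0.1, 0.5, 0.9, 0.7, 0.3, 0.15]:
    print(cost, decider.update(cost).name)
```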

Relevance:

10.00%

Publisher:

Abstract:

Slippage in the roller-race contact has always played a central role in the field of diagnostics of rolling element bearings. Due to this phenomenon, vibrations triggered by a localized damage are not strictly periodic and are therefore not detectable by means of common spectral functions such as the power spectral density or the discrete Fourier transform. Owing to the strong second-order cyclostationary component characterizing these signals, techniques such as the cyclic coherence, its integrated form and the squared envelope spectrum have proven to be effective in a wide range of applications. An expert user can easily identify a damage and its location within the bearing components by looking for particular patterns of peaks in the output of the selected cyclostationary tool. These peaks are found in the neighborhood of specific frequencies that can be calculated in advance as functions of the geometrical features of the bearing itself. Unfortunately, the non-periodicity of the vibration signal is not the only consequence of slippage: often it also involves a displacement of the damage characteristic peaks from the theoretically expected frequencies. This issue becomes particularly important in the attempt to develop highly automated algorithms for bearing damage recognition and, in order to correctly set thresholds and tolerances, a quantitative description of the magnitude of the above-mentioned deviations is needed. This paper aims at identifying the dependency of the deviations on the different operating conditions. This has been possible thanks to an extended experimental activity performed on a full-scale bearing test rig, able to reproduce realistically the operating and environmental conditions typical of an industrial high-power electric motor and gearbox. The importance of load is investigated in detail for different bearing damages. Finally, some guidelines on how to cope with such deviations are given, based on the expertise gained in the experimental activity.
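As a hedged illustration of the standard processing chain implied here (not the authors' exact algorithm): compute the squared envelope spectrum via the analytic signal, then search for a fault peak inside a tolerance band around the theoretical outer-race frequency to absorb the slippage-induced deviation. The signal and all parameters below are synthetic.

```python
import numpy as np
from scipy.signal import hilbert

fs = 20_000                       # sampling rate [Hz]
t = np.arange(0, 2.0, 1 / fs)

def bpfo(n_rollers, f_rot, d_roller, d_pitch, contact_angle=0.0):
    """Theoretical ball-pass frequency, outer race [Hz]."""
    return (n_rollers / 2) * f_rot * \
        (1 - (d_roller / d_pitch) * np.cos(contact_angle))

f_fault = bpfo(n_rollers=9, f_rot=25.0, d_roller=7.9, d_pitch=34.5)

# Synthetic signal: repetitive impacts near BPFO (with 1% random jitter
# mimicking slippage) exciting a 3 kHz resonance, plus noise.
x = np.random.normal(0, 0.3, t.size)
hit = 0.0
while hit < t[-1]:
    idx = int(hit * fs)
    n = min(400, t.size - idx)
    x[idx:idx + n] += np.exp(-np.arange(n) / 40) \
        * np.sin(2 * np.pi * 3000 * np.arange(n) / fs)
    hit += (1 / f_fault) * np.random.normal(1.0, 0.01)

# Squared envelope spectrum via the analytic signal.
env2 = np.abs(hilbert(x)) ** 2
spec = np.abs(np.fft.rfft(env2 - env2.mean()))
freqs = np.fft.rfftfreq(env2.size, 1 / fs)

# Search within a +/-2% tolerance band around the theoretical BPFO.
band = (freqs > 0.98 * f_fault) & (freqs < 1.02 * f_fault)
peak = freqs[band][np.argmax(spec[band])]
print(f"theoretical BPFO {f_fault:.1f} Hz, detected {peak:.1f} Hz")
```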

Relevance:

10.00%

Publisher:

Abstract:

Safety at railway level crossings (RLXs) is one part of a wider picture of safety within the whole transport system. Governments, the rail industry and road organisations have used a variety of countermeasures for many years to improve RLX safety. New types of interventions are required to reduce the number of crashes and the associated social costs at railway crossings. This paper presents the results of a large research program which aimed to assess the effectiveness of emerging Intelligent Transport Systems (ITS) interventions, both on-road and in-vehicle, in improving the safety of car drivers at RLXs in Australia. The three most promising technologies selected from the literature review and focus groups were tested in an advanced driving simulator to provide a detailed assessment of their effects on driver behaviour. The three interventions were: (i) an in-vehicle visual warning using a GPS/smartphone navigation-like system; (ii) an in-vehicle audio warning; and (iii) an on-road intervention known as a valet system (warning lights on the road surface activated as a train approaches). The effects of these technologies on 57 participants were assessed in a systematic approach focusing on the safety of the intervention, the effects on road traffic around the crossings, and drivers' acceptance of the technology. Given that the ITS interventions were likely to provide a benefit by improving drivers' awareness of the crossing status in low-visibility conditions, such conditions were investigated through curves in the track before arriving at the crossing. ITS interventions were also expected to improve driver behaviour at crossings with high traffic (the blocking-back issue), which was also investigated at active crossings. The key findings are: (i) interventions at passive crossings are likely to provide safety benefits; (ii) the benefits of ITS interventions on driver behaviour at active crossings are limited; (iii) the trialled ITS interventions did not show any issues in terms of driver distraction, driver acceptance or traffic delays; (iv) the interventions are easy to use and do not increase driver workload substantially; (v) participants' intention to use the technology is high; and (vi) participants saw most value in succinct messages about approaching trains, as opposed to knowing the RLX locations or the imminence of a collision with a train.