991 results for Facilitating conditions
Abstract:
A significant amount of speech data is required to develop a robust speaker verification system, but it is difficult to find enough development speech to match all expected conditions. In this paper we introduce a new approach to Gaussian probabilistic linear discriminant analysis (GPLDA) to estimate reliable model parameters as a linearly weighted model taking more input from the large volume of available telephone data and smaller proportional input from limited microphone data. In comparison to a traditional pooled training approach, where the GPLDA model is trained over both telephone and microphone speech, this linear-weighted GPLDA approach is shown to provide better EER and DCF performance in microphone and mixed conditions in both the NIST 2008 and NIST 2010 evaluation corpora. Based upon these results, we believe that linear-weighted GPLDA will provide a better approach than pooled GPLDA, allowing for the further improvement of GPLDA speaker verification in conditions with limited development data.
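As a rough sketch of this weighting idea (not the authors' implementation; the dictionary layout, parameter names and the value of the weight alpha below are assumptions), the linearly weighted combination of GPLDA parameters trained separately on telephone and microphone data might look like this:

```python
import numpy as np

def linear_weighted_gplda(params_tel, params_mic, alpha=0.8):
    """Combine GPLDA parameters estimated separately on telephone and
    microphone data as a linear weighting.  Each parameter set is assumed
    to hold a mean vector, a between-speaker covariance and a
    within-speaker covariance; alpha weights the larger telephone set."""
    combined = {}
    for key in ("mu", "sigma_between", "sigma_within"):
        combined[key] = alpha * params_tel[key] + (1.0 - alpha) * params_mic[key]
    return combined
```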
Abstract:
The reduction of the health literacy concept to a functional relationship with text does not acknowledge the range of information sources that people draw on in order to make informed decisions about their health and treatment. Drawing on two studies that explored the experiences of people with two different but complex and life-threatening chronic health conditions, chronic kidney disease and HIV, a socio-cultural understanding of the practice of health literacy is described. Health information is experienced by patients as a chronic health condition landscape and develops from three information sources, namely epistemic, social and corporeal sources. Participants in both studies used activities that involved orienting, sharing and creating information to map this landscape, which in turn informed their decision-making. These findings challenge traditional conceptions of health literacy and suggest an approach that views the landscape of chronic illness as being socially, physically and contextually constructed. This approach necessitates a recasting of health literacy away from a sole interest in skills and towards an understanding of how information practices facilitate people becoming health literate.
Abstract:
The current global economic instability and the vulnerability of small island nations are providing the impetus for greater integration between the countries of the South Pacific region. This exercise is critical for their survival in today’s turbulent economic environment. Past efforts at regional integration in the South Pacific have not been very successful. Reasons attributed to this outcome include issues related to perceived damage to sovereignty and the lack of a shared integration infrastructure. Today, IT resources with collaborative capacities provide the opportunity to develop a shared IT infrastructure to facilitate integration in the South Pacific. In an attempt to develop a model of regional integration with an IT-backed infrastructure, we identify and report on the antecedents of the current stage of regional integration and the stakeholders’ perceived benefits of IT-backed regional integration in the South Pacific. Employing a case study based approach, the study finds that while most stakeholders were positive about the potential of IT-backed regional integration, significant challenges exist that hinder the realisation of this model. The study finds that facilitating IT-backed regional integration requires an enabling IT infrastructure, equitable IT development in the region, greater awareness of the potential of modern IT resources, market liberalisation of the information and telecommunications sector and greater political support for IT initiatives.
Abstract:
This article examines the conditions of penal hope behind suggestions that the penal expansionism of the last three decades may be at a ‘turning point’. The article proceeds by outlining David Green’s (2013b) suggested catalysts of penal reform and considers how applicable they are in the Australian context. Green’s suggested catalysts are: the cycles and saturation thesis; shifts in the dominant conception of the offender; the global financial crisis (GFC) and budgetary constraints; the drop in crime; the emergence of the prisoner re‐entry movement; apparent shifts in public opinion; the influence of evangelical Christian ideas; and the Right on Crime initiative. The article then considers a number of other possible catalysts or forces: the role of trade unions; the role of courts; the emergence of recidivism as a political issue; the influence of ‘evidence based’/‘what works’ discourse; and the emergence of justice reinvestment (JR). The article concludes with some comments about the capacity of criminology and criminologists to contribute to penal reductionism, offering an optimistic assessment for the prospects of a reflexive criminology that engages in and engenders a wider politics around criminal justice issues.
Abstract:
The ability of the technique of large-amplitude Fourier transformed (FT) ac voltammetry to facilitate the quantitative evaluation of electrode processes involving electron transfer and catalytically coupled chemical reactions has been evaluated. Predictions derived on the basis of detailed simulations imply that the rate of electron transfer is crucial, as confirmed by studies on the ferrocenemethanol (FcMeOH)-mediated electrocatalytic oxidation of ascorbic acid. Thus, at glassy carbon, gold, and boron-doped diamond electrodes, the introduction of the coupled electrocatalytic reaction, while producing significantly enhanced dc currents, does not affect the ac harmonics. This outcome is as expected if the FcMeOH0/+ process remains fully reversible in the presence of ascorbic acid. In contrast, the ac harmonic components available from FT-ac voltammetry are predicted to be highly sensitive to the homogeneous kinetics when an electrocatalytic reaction is coupled to a quasi-reversible electron-transfer process. The required quasi-reversible scenario is available at an indium tin oxide electrode. Consequently, reversible potential, heterogeneous charge-transfer rate constant, and charge-transfer coefficient values of 0.19 V vs Ag/AgCl, 0.006 cm s−1 and 0.55, respectively, along with a second-order homogeneous chemical rate constant of 2500 M−1 s−1 for the rate-determining step in the catalytic reaction were determined by comparison of simulated responses and experimental voltammograms derived from the dc and first to fourth ac harmonic components generated at an indium tin oxide electrode. The theoretical concepts derived for large-amplitude FT ac voltammetry are believed to be applicable to a wide range of important solution-based mediated electrocatalytic reactions.
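The harmonic-separation step that underlies FT ac voltammetry, Fourier transforming the measured current, retaining a band centred on a multiple of the ac perturbation frequency and inverse transforming to recover that harmonic's envelope, can be sketched as follows. This is a generic illustration rather than the simulation code used in the study; the function name, bandwidth and sampling parameters are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def extract_ac_harmonic(current, fs, f_ac, n, half_bandwidth=2.0):
    """Return the envelope of the n-th ac harmonic of a voltammetric
    current record sampled at fs Hz, given an ac perturbation at f_ac Hz:
    FFT, keep a band centred on n*f_ac, inverse-FFT, then take the
    magnitude of the analytic signal."""
    spectrum = np.fft.rfft(current)
    freqs = np.fft.rfftfreq(len(current), d=1.0 / fs)
    band = np.abs(freqs - n * f_ac) <= half_bandwidth
    harmonic = np.fft.irfft(np.where(band, spectrum, 0.0), n=len(current))
    return np.abs(hilbert(harmonic))
```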
Abstract:
We demonstrate that a three dimensional (3D) crystalline tungsten trioxide (WO3) nanoporous network, directly grown on a transparent conductive oxide (TCO) substrate, is a suitable working electrode material for high performance electrochromic devices. This nanostructure, with achievable thicknesses of up to 2 μm, is prepared at room temperature by the electrochemical anodization of an RF-sputtered tungsten film deposited on fluorine-doped tin oxide (FTO) conductive glass, under low applied anodic voltages and mild chemical dissolution conditions. For the crystalline nanoporous network with thicknesses ranging from 0.6 to 1 μm, impressive coloration efficiencies of up to 141.5 cm2 C−1 are achieved by applying a low coloration voltage of −0.25 V. It is also observed that there is no significant degradation of the electrochromic properties of the porous film after 2000 continuous coloration–bleaching cycles. The remarkable electrochromic characteristics of this crystalline and nanoporous WO3 are mainly ascribed to the combination of a large surface area, facilitating increased intercalation of protons, with excellent continuous and directional paths for charge transfer and proton migration in the highly crystalline material.
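For context, coloration efficiency is conventionally computed as the change in optical density per unit of injected charge density, CE = ΔOD/Q = log10(T_bleached/T_colored)/(Q/A). A minimal sketch with illustrative numbers (not the measured device data reported above):

```python
import numpy as np

def coloration_efficiency(t_bleached, t_colored, charge_mC, area_cm2):
    """Coloration efficiency in cm2 C-1: optical density change divided by
    the injected charge density."""
    delta_od = np.log10(t_bleached / t_colored)
    charge_density = (charge_mC * 1e-3) / area_cm2  # C cm-2
    return delta_od / charge_density

# Illustrative example: switching from 80% to 35% transmittance with
# 2.5 mC of charge over 1 cm2 gives roughly 144 cm2 C-1.
print(coloration_efficiency(0.80, 0.35, 2.5, 1.0))
```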
Abstract:
In order to establish the influence of the drying air characteristics on the drying performance and fluidization quality of bovine intestine for pet food, several drying tests have been carried out in a laboratory-scale heat pump assisted fluid bed dryer equipped with a continuous monitoring system. Bovine intestine samples were heat pump fluidized bed dried at atmospheric pressure and at temperatures below and above the material's freezing point. The investigation of the drying characteristics was conducted in the temperature range −10 to 25 °C and with airflow in the range 1.5–2.5 m/s. Some experiments were conducted as single-temperature drying experiments and others as two-stage drying experiments employing two temperatures. An Arrhenius-type equation was used to interpret the influence of the drying air temperature on the effective diffusivity, calculated with the method of slopes, in terms of activation energy, which was found to be sensitive to temperature. The effective diffusion coefficient of moisture transfer was determined by the Fickian method, assuming uni-dimensional moisture movement, for both moisture removal by evaporation and combined sublimation and evaporation. Correlations expressing the effective moisture diffusivity as a function of drying temperature are reported. Bovine particles were characterized according to the Geldart classification, and the minimum fluidization velocity was calculated using the Ergun equation and a generalized equation for all drying conditions at the beginning and end of the trials. Walli's model was used to categorize the stability of the fluidization at the beginning and end of the drying for each trial. The determined Walli's values were positive at the beginning and end of all trials, indicating stable fluidization under each drying condition.
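The Arrhenius-type treatment of the effective diffusivity mentioned above amounts to a linear fit of ln(D_eff) against 1/T, whose slope yields the activation energy. A minimal sketch with placeholder values (not the measured data):

```python
import numpy as np

# Placeholder effective diffusivities (m2/s) at several drying air
# temperatures (K); not the experimental values from the study.
T = np.array([263.15, 273.15, 283.15, 298.15])
D_eff = np.array([1.2e-10, 2.0e-10, 3.1e-10, 5.5e-10])

# Arrhenius-type relation D_eff = D0 * exp(-Ea / (R * T)): fit ln(D_eff)
# against 1/T and recover Ea from the slope, D0 from the intercept.
R = 8.314  # J mol-1 K-1
slope, intercept = np.polyfit(1.0 / T, np.log(D_eff), 1)
Ea = -slope * R          # activation energy, J/mol
D0 = np.exp(intercept)   # pre-exponential factor, m2/s
print(f"Ea = {Ea / 1000:.1f} kJ/mol, D0 = {D0:.2e} m2/s")
```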
Abstract:
Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates energy consumption and generation data of the building into an interactive simulation, which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and negative impact on ecosystems with both local and global consequences. The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualization with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that have shared interests in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Similarly, Froehlich (2010) and his colleagues use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles, but more importantly, proposes an example of how these principles may be achieved. The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modeling. From a mathematical perspective, the simulation can be divided into two models, which interact and compete for balance – the comfort of ECOS’ virtual denizens and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements can be ascertained, this is then balanced with the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its energy requirements. Calculating the relative amount of energy produced by wind and solar can be done, in the case of solar for example, by considering the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The variables that can be changed are the budget allocated to green energy sources such as the Solar Panels and Wind Generator, and the Air Conditioning used to control the internal building temperature. These variables influence the energy input and output variables, modeled on the real energy usage statistics drawn from the SEC data provided by the building managers; a simplified sketch of this balance follows.
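A minimal sketch of the balance described above, with a heating/cooling demand driven by the indoor-outdoor temperature difference weighed against green generation; the loss coefficient, coefficient of performance, panel area and PV efficiency are placeholder assumptions, not the SEC building's parameters or the ECOS code:

```python
def hvac_power_kw(t_outside, t_setpoint, ua_kw_per_degc=5.0, cop=3.0):
    """Electrical power needed to hold the setpoint, estimated from the
    indoor-outdoor temperature difference, an overall loss coefficient UA
    and the heating/cooling coefficient of performance (all assumed)."""
    return ua_kw_per_degc * abs(t_outside - t_setpoint) / cop

def solar_power_kw(panel_area_m2, irradiance_w_m2, efficiency=0.18):
    """Instantaneous PV output from panel area, irradiance and an assumed
    conversion efficiency."""
    return panel_area_m2 * irradiance_w_m2 * efficiency / 1000.0

demand = hvac_power_kw(t_outside=34.0, t_setpoint=23.0)
supply = solar_power_kw(panel_area_m2=200.0, irradiance_w_m2=650.0)
print(f"demand {demand:.1f} kW vs green supply {supply:.1f} kW")
```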
Abstract:
Recent studies have demonstrated that angiogenesis and suppressed cell-mediated immunity (CMI) play a central role in the pathogenesis of malignant disease, facilitating tumour growth, invasion and metastasis. In the majority of tumours, the malignant process is preceded by a pathological condition or exposure to an irritant which itself is associated with the induction of angiogenesis and/or suppressed CMI. These include: cigarette smoking, chronic bronchitis and lung cancer; chronic oesophagitis and oesophageal cancer; chronic viral infections such as human papilloma virus and ano-genital cancers, chronic hepatitis B and C and hepatocellular carcinoma, and Epstein-Barr virus (EBV) and lymphomas; chronic inflammatory conditions such as Crohn's disease and ulcerative colitis and colorectal cancer; asbestos exposure and mesothelioma; and excessive sunlight exposure/sunburn and malignant melanoma. Chronic exposure to growth factors (insulin-like growth factor-I in acromegaly), mutations in tumour suppressor genes (TP53 in Li Fraumeni syndrome) and long-term exposure to immunosuppressive agents (cyclosporin A) may also give rise to similar environments and are associated with the development of a range of solid tumours. The increased blood supply would facilitate the development and proliferation of an abnormal clone or clones of cells arising as the result of: (a) an inherited genetic abnormality; and/or (b) acquired somatic mutations, the latter due to local production and/or enhanced delivery of carcinogens and mutagenic growth factors. With progressive detrimental mutations and growth-induced tumour hypoxia, the transformed cell, to a lesser or greater extent, may amplify the angiogenic process and CMI suppression, thereby facilitating further tumour growth and metastasis. There is accumulating evidence that long-term treatment with cyclo-oxygenase inhibitors (aspirin and indomethacin), cytokines such as interferon-α, anti-oestrogens (tamoxifen and raloxifene) and captopril significantly reduces the incidence of solid tumours such as breast and colorectal cancer. These agents are anti-angiogenic and, in the case of aspirin, indomethacin and interferon-α, have proven immunomodulatory effects. Collectively these observations indicate that angiogenesis and suppressed CMI play a central role in the development and progression of malignant disease. (C) 2000 Elsevier Science Ltd.
Abstract:
There is a growing trend to offer students learning opportunities that are flexible, innovative and engaging. As educators embrace student-centred agile teaching and learning methodologies, which require continuous reflection and adaptation, the need to evaluate students’ learning in a timely manner has become more pressing. Conventional evaluation surveys currently dominate the evaluation landscape internationally, despite recognition that they are insufficient to effectively evaluate curriculum and teaching quality. Surveys often: (1) fail to address the issues for which educators need feedback, (2) constrain student voice, (3) have low response rates and (4) occur too late to benefit current students. Consequently, this paper explores principles of effective feedback to propose a framework for learner-focused evaluation. We apply a three-stage control model, involving feedforward, concurrent and feedback evaluation, to investigate the intersection of assessment and evaluation in agile learning environments. We conclude that learner-focused evaluation cycles can be used to guide action so that evaluation is not undertaken simply for the benefit of future offerings, but rather to benefit current students by allowing ‘real-time’ learning activities to be adapted in the moment. As a result, students become co-producers of learning and evaluation becomes a meaningful, responsive dialogue between students and their instructors.
Abstract:
Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. Resolving organelles that display similar behavior during a protocol designed to provide partial enrichment represents a possible shortcoming. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients are alone unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
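The combination step can be illustrated by concatenating the per-protein abundance profiles from the two separations and classifying the markers with a support vector machine. The sketch below uses random placeholder arrays and scikit-learn, not the published LOPIT data or the authors' analysis pipeline:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X1, X2: (n_proteins, n_fractions) abundance profiles from two distinct
# gradients; y: organelle labels for marker proteins (placeholders here).
rng = np.random.default_rng(0)
n_proteins, n_fractions = 300, 8
X1 = rng.random((n_proteins, n_fractions))
X2 = rng.random((n_proteins, n_fractions))
y = rng.integers(0, 6, size=n_proteins)

# Concatenating the profiles gives the classifier the separating
# information contained in both gradients.
X_combined = np.hstack([X1, X2])
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
print(cross_val_score(clf, X_combined, y, cv=5).mean())
```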
Abstract:
The irradiance profile around the receiver tube (RT) of a parabolic trough collector (PTC) is a key outcome of its optical performance that affects the overall energy performance of the collector. Thermal performance evaluation of the RT relies on the appropriate determination of this irradiance profile. This article explains a technique in which empirical equations were developed to calculate the local irradiance as a function of the angular location on the RT of a standard PTC, using a rigorously verified Monte Carlo ray tracing model. A large range of test conditions, including daily normal insolation, spectral selective coatings and glass envelope conditions, was selected from the data published by Dudley et al. [1] for this purpose. The R2 values of the equations are excellent, varying between 0.9857 and 0.9999. Therefore, these equations can be used confidently to produce a realistic non-uniform boundary heat flux profile around the RT at normal incidence for conjugate heat transfer analyses of the collector. The required inputs to the equations are the daily normal insolation and the spectral selective properties of the collector components. Since the equations are polynomial functions, data processing software can be employed to calculate the flux profile very easily and quickly. The ultimate goal of this research is to facilitate ongoing work towards making concentrating solar power technology cost competitive with conventional energy technology.
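Since the reported correlations are polynomials in the angular location, applying them reduces to a polynomial evaluation. The sketch below uses entirely hypothetical coefficients (the actual coefficients depend on insolation and on the spectral selective properties of the collector components) simply to show how a non-uniform flux profile could be generated for a conjugate heat transfer model:

```python
import numpy as np

# Hypothetical polynomial coefficients for the local irradiance as a
# function of angular location (degrees); illustrative shape only.
coeffs = [2.1e-7, -1.5e-4, 3.2e-2, -1.8, 45.0]

theta = np.linspace(0.0, 360.0, 721)    # angular location around the RT
local_flux = np.polyval(coeffs, theta)  # local irradiance (arbitrary units)

# local_flux can then be imposed as a non-uniform boundary heat flux in a
# conjugate heat transfer model of the receiver tube.
```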
Abstract:
The diagnostics of mechanical components operating in transient conditions is still an open issue in both research and industrial fields. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics in any operating condition and at any noise level, is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD), Minimum Entropy Deconvolution (MED) and the analytic signal approach based on the Hilbert transform. The proposed tool is able to supply diagnostic information on the basis of experimental vibration signals measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high-speed train traction equipment, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on spectral kurtosis or envelope analysis, which have until now represented the benchmark for bearing diagnostics.
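The final demodulation stage of this kind of processing, forming the envelope of an enhanced vibration record via the Hilbert transform and inspecting its spectrum for bearing fault frequencies, can be sketched as follows; this is a generic illustration, not the EEMED implementation:

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(x, fs):
    """Return frequencies and amplitude of the envelope spectrum of a
    (pre-filtered or EMD/MED-enhanced) vibration signal sampled at fs Hz;
    localized bearing faults appear as discrete lines at the fault
    frequencies and their harmonics."""
    envelope = np.abs(hilbert(x - np.mean(x)))
    spectrum = np.abs(np.fft.rfft(envelope - np.mean(envelope))) / len(envelope)
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs, spectrum
```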
Abstract:
In the field of rolling element bearing diagnostics, envelope analysis, and in particular the squared envelope spectrum, has gained a leading role in recent years among the various digital signal processing techniques. The original constraint of constant operating speed has been relaxed thanks to the combination of this technique with computed order tracking, which is able to resample signals at constant angular increments. In this way, the field of application of the squared envelope spectrum has been extended to cases in which small speed fluctuations occur, maintaining the effectiveness and efficiency that characterize this successful technique. However, the constraint on speed has to be removed completely, making envelope analysis suitable also for speed and load transients, in order to implement an algorithm valid for all industrial applications. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This paper is aimed at providing and testing a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique will also be qualitatively addressed.
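A generic sketch of the two steps discussed above, computed order tracking by resampling at constant angular increments followed by the squared envelope spectrum in the order domain, is given below; the function name and sampling choices are assumptions, not the procedure proposed in the paper:

```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_order_spectrum(x, shaft_angle, samples_per_rev=64):
    """x: vibration samples; shaft_angle: instantaneous shaft angle (rad)
    at the same instants (e.g. from a tachometer).  The signal is
    resampled at constant angular increments (computed order tracking) so
    speed fluctuations no longer smear the fault components, then the
    squared envelope spectrum is computed in the order domain."""
    revolutions = shaft_angle[-1] / (2.0 * np.pi)
    theta_uniform = np.linspace(0.0, shaft_angle[-1],
                                int(revolutions * samples_per_rev))
    x_angular = np.interp(theta_uniform, shaft_angle, x)

    env_sq = np.abs(hilbert(x_angular - np.mean(x_angular))) ** 2
    spectrum = np.abs(np.fft.rfft(env_sq - np.mean(env_sq))) / len(env_sq)
    orders = np.fft.rfftfreq(len(env_sq), d=1.0 / samples_per_rev)
    return orders, spectrum
```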
Abstract:
Diagnostics of rolling element bearings involves a combination of different techniques of signal enhancement and analysis. The most common procedure comprises a first step of order tracking and synchronous averaging, which removes from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been obtained in order to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional step of pre-whitening is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filters and spectral kurtosis. Recently, a new technique for pre-whitening has been proposed, based on cepstral analysis: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it seems a good candidate to perform the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique is tested on data measured on a full-scale industrial bearing test rig, able to reproduce harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques is made as a final step to verify the potential of the cepstrum pre-whitening.
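Cepstrum pre-whitening is commonly formulated as dividing the spectrum of the signal by its own magnitude, which is equivalent to zeroing the real cepstrum at all non-zero quefrencies; a minimal sketch of that operation (a generic formulation, not the authors' code):

```python
import numpy as np

def cepstrum_prewhitening(x, eps=1e-12):
    """Flatten the magnitude spectrum of x while keeping its phase, which
    suppresses the deterministic (discrete-frequency) components and
    leaves the impulsive bearing signature for subsequent envelope
    analysis.  eps avoids division by zero."""
    X = np.fft.fft(x)
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))
```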