991 results for "optimize"


Relevance:

10.00%

Publisher:

Abstract:

In 1963, the National Institutes of Health (NIH) first issued guidelines for animal housing and husbandry. The most recent 2010 revision emphasizes animal care “in ways judged to be scientifically, technically, and humanely appropriate” (National Institutes of Health, 2010, p. XIII). The goal of these guidelines is to ensure humane treatment of animals and to optimize the quality of research. Although these animal care guidelines cover a substantial amount of information regarding animal housing and husbandry, researchers generally do not report all these variables (see Table 1). The importance of housing and husbandry conditions with respect to standardization across different research laboratories has been debated previously (Crabbe et al., 1999; Van Der Staay and Steckler, 2002; Wahlsten et al., 2003; Wolfer et al., 2004; Van Der Staay, 2006; Richter et al., 2010, 2011). This paper focuses on several animal husbandry and housing issues that are particularly relevant to stress responses in rats, including transportation, handling, cage changing, housing conditions, light levels and the light–dark cycle. We argue that these key animal housing and husbandry variables should be reported in greater detail in an effort to raise awareness about extraneous experimental variables, especially those that have the potential to interact with the stress response.

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: To optimize the animal model of liver injury that best represents the pathological characteristics of the dampness-heat jaundice syndrome of traditional Chinese medicine. METHODS: Liver injury was induced in model rats with alpha-naphthylisothiocyanate (ANIT) and carbon tetrachloride (CCl4), respectively, and the effects of Yinchenhao Decoction (YCHD), a Chinese medical formula of proven clinical efficacy for treating the dampness-heat jaundice syndrome, on the two liver injury models were evaluated by analyzing the serum levels of alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), malondialdehyde (MDA), total bilirubin (T-BIL), superoxide dismutase (SOD) and glutathione peroxidase (GSH-Px), as well as the ratio of liver weight to body weight. The experimental data were analyzed by principal component analysis, a pattern recognition method. RESULTS: The ratio of liver weight to body weight was significantly elevated in the ANIT and CCl4 groups compared with the normal control (P<0.01). The contents of ALT and T-BIL were significantly higher in the ANIT group than in the normal control (P<0.05, P<0.01), and the levels of AST, ALT and ALP were significantly elevated in the CCl4 group relative to those in the normal control (P<0.01). In the YCHD group, the increase in AST, ALT and ALP levels was significantly reduced (P<0.05, P<0.01), with no significant increase in serum T-BIL. In the CCl4-intoxicated group, the MDA content was significantly increased and SOD and GSH-Px activities were significantly decreased compared with the normal control group (P<0.01). The increase in MDA induced by CCl4 was significantly reduced by YCHD (P<0.05). CONCLUSION: YCHD showed significant effects in preventing the progression of liver injury induced by CCl4, and the most suitable animal model for the dampness-heat jaundice syndrome may be the one induced by CCl4.
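As a brief, hedged illustration of the pattern-recognition step, the sketch below applies principal component analysis to a small table of serum markers; the animal numbers, marker values and use of scikit-learn are invented for demonstration and are not the study's data or code.

    # Illustrative PCA on hypothetical serum-marker data (not the study's measurements).
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    # Rows: animals; columns: ALT, AST, ALP, T-BIL, MDA, SOD, GSH-Px (arbitrary units).
    groups = ["control", "control", "ANIT", "ANIT", "CCl4", "CCl4"]
    X = np.array([
        [40,  90, 150, 2.0, 3.1, 220, 480],
        [45,  95, 160, 2.2, 3.0, 210, 470],
        [160, 140, 400, 9.5, 3.4, 200, 450],
        [150, 150, 380, 9.0, 3.5, 205, 455],
        [300, 320, 420, 3.0, 7.8, 120, 300],
        [280, 310, 410, 3.2, 7.5, 130, 310],
    ])

    # Standardize each marker, then project onto the first two principal components;
    # groups with similar injury patterns should cluster together in this plane.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    for group, (pc1, pc2) in zip(groups, scores):
        print(f"{group:8s} PC1={pc1:+.2f} PC2={pc2:+.2f}")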

Relevance:

10.00%

Publisher:

Abstract:

This thesis analyses the performance bounds of amplify-and-forward relay channels, which are becoming increasingly popular in wireless communication applications. The statistics of the cascaded Nakagami-m fading model, a major obstacle in evaluating the outage of wireless networks, are analysed using the Mellin transform. Furthermore, the upper and lower bounds for the ergodic capacity of the slotted amplify-and-forward relay channel, for finite and infinite numbers of relays, are derived using random matrix theory. The results obtained will enable wireless network designers to optimize network resources, benefiting consumers.
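As a hedged aside (these are standard identities, not expressions taken from the thesis), the reason the Mellin transform suits cascaded fading is that it converts a product of independent channel gains into a product of transforms, and for a Nakagami-m amplitude the transform is available in closed form:

\[
\mathcal{M}_X(s) = \int_0^\infty x^{s-1} f_X(x)\,dx = \mathbb{E}\!\left[X^{s-1}\right],
\qquad
Z=\prod_{i=1}^{N} X_i \ \Rightarrow\ \mathcal{M}_Z(s)=\prod_{i=1}^{N}\mathcal{M}_{X_i}(s),
\]

\[
X_i \sim \mathrm{Nakagami}(m_i,\Omega_i) \ \Rightarrow\ 
\mathcal{M}_{X_i}(s) = \frac{\Gamma\!\left(m_i+\frac{s-1}{2}\right)}{\Gamma(m_i)}
\left(\frac{\Omega_i}{m_i}\right)^{\frac{s-1}{2}}.
\]

The outage and capacity bounds reported in the thesis follow from manipulating and inverting such transforms; the exact derived expressions are not reproduced here.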

Relevance:

10.00%

Publisher:

Abstract:

Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates the building's energy consumption and generation data into an interactive simulation which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and a negative impact on ecosystems, with both local and global consequences.

The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the merging of environmental data visualisation with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that share an interest in sustainability. After surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Froehlich (2010) and his colleagues similarly use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved.

The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modeling. From a mathematical perspective, the simulation can be divided into two models which interact and compete for balance: the comfort of ECOS’ virtual denizens, and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort, and provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirement is ascertained, it is balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its energy requirements. The relative amount of energy produced by wind and solar can be calculated by considering, in the case of solar for example, the size of the panel and the amount of solar radiation it receives at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The user can change the budget allocated to green energy sources such as the solar panels and wind generator, and adjust the air conditioning to control the internal building temperature. These variables influence the energy input and output variables, which are modeled on real energy usage statistics drawn from the SEC data provided by the building managers.
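The balance described above can be illustrated with a toy calculation; the envelope conductance, panel area, efficiencies and weather values below are assumptions chosen for demonstration rather than the SEC building's figures, and the real installation polls the Google Weather API instead of using a fixed sample temperature.

    # Toy energy-balance sketch in the spirit of the ECOS simulation (illustrative numbers only).
    UA = 25_000.0          # assumed overall envelope conductance, W/K
    PANEL_AREA = 800.0     # assumed solar panel area, m^2
    PANEL_EFF = 0.18       # assumed panel efficiency
    WIND_POWER = 40_000.0  # assumed average wind generation, W

    def hvac_load_w(outside_c: float, setpoint_c: float) -> float:
        """Rough heating/cooling power needed to hold the thermostat setpoint, in W."""
        return UA * abs(outside_c - setpoint_c)

    def solar_power_w(irradiance_w_m2: float) -> float:
        """Rough PV output of the assumed array for a given irradiance, in W."""
        return PANEL_AREA * PANEL_EFF * irradiance_w_m2

    def green_balance_w(outside_c: float, setpoint_c: float, irradiance_w_m2: float) -> float:
        """Positive: green generation covers the HVAC load; negative: shortfall."""
        return solar_power_w(irradiance_w_m2) + WIND_POWER - hvac_load_w(outside_c, setpoint_c)

    # Example: a 32 C afternoon with the thermostat set to 24 C and moderate sun.
    print(green_balance_w(outside_c=32.0, setpoint_c=24.0, irradiance_w_m2=600.0))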

Relevance:

10.00%

Publisher:

Abstract:

Two varieties of grapes, a white grape and a red grape grown in the Campania region of Italy, were selected for a study of drying characteristics, moisture diffusion, quality changes (colour) and shrinkage behaviour. Comparisons were made between treated and untreated grapes under a constant drying condition of 50 °C in a conventional drying system; this temperature was selected to represent farm drying conditions. Grapes were purchased from a local market from the same supplier to maintain uniform size and properties. An abrasive physical treatment was used as the pretreatment. The drying curves were constructed and the drying kinetics were modelled using several commonly available models. It was found that treated samples show better drying characteristics than untreated samples. The objective of this study is to obtain drying kinetics that can be used to optimize grape drying operations.
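One of the commonly available thin-layer models is the Page model, MR(t) = exp(-k t^n); as a hedged sketch, it can be fitted to measured moisture ratios with SciPy as below, where the data points are invented because the study's measurements are not given in the abstract.

    # Fit the Page thin-layer drying model MR(t) = exp(-k * t**n) to hypothetical data.
    import numpy as np
    from scipy.optimize import curve_fit

    def page_model(t, k, n):
        return np.exp(-k * t**n)

    t_hours = np.array([0.0, 4.0, 8.0, 16.0, 24.0, 48.0, 72.0])
    moisture_ratio = np.array([1.00, 0.82, 0.68, 0.47, 0.33, 0.14, 0.06])  # invented values

    (k, n), _ = curve_fit(page_model, t_hours, moisture_ratio, p0=(0.05, 1.0))
    print(f"Page model fit: k = {k:.4f} 1/h^n, n = {n:.3f}")

Comparing such fitted parameters for treated and untreated samples is one way to quantify the faster drying reported for the pretreated grapes.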

Relevance:

10.00%

Publisher:

Abstract:

The Taguchi method is applied for the first time to optimize the synthesis of graphene films by copper-catalyzed decomposition of ethanol. In order to find the most appropriate experimental conditions for the realization of thin, high-grade films, six experiments were suitably designed and performed. The influence of temperature (1000–1070 °C), synthesis duration (1–30 min) and hydrogen flow (0–100 sccm) on the number of graphene layers and the defect density in the graphitic lattice was ranked by monitoring the intensity of the 2D- and D-bands relative to the G-band in the Raman spectra. After critical examination and adjustment of the conditions predicted to give optimal results, a continuous film consisting of 2–4 nearly defect-free graphene layers was obtained.
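As a hedged illustration of the ranking step (the run layout and Raman intensity ratios below are invented, not the paper's design or results), a Taguchi analysis computes a signal-to-noise ratio for each run, averages it per factor level, and ranks factors by the spread of those averages:

    # Illustrative Taguchi ranking for a "larger-the-better" response, here an invented
    # I(2D)/I(G) Raman intensity ratio used as a proxy for graphene film quality.
    import numpy as np

    # Each run: (temperature level, duration level, H2-flow level, response).
    runs = [
        (0, 0, 0, 0.9), (0, 1, 1, 1.4), (0, 2, 2, 1.1),
        (1, 0, 1, 1.6), (1, 1, 2, 2.0), (1, 2, 0, 1.3),
    ]

    def sn_larger_better(values):
        y = np.asarray(values, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y**2))

    factors = ["temperature", "duration", "H2 flow"]
    for idx, name in enumerate(factors):
        levels = sorted({run[idx] for run in runs})
        level_means = [sn_larger_better([run[3] for run in runs if run[idx] == lv])
                       for lv in levels]
        effect = max(level_means) - min(level_means)
        print(f"{name:12s} mean S/N per level: {[round(m, 2) for m in level_means]}"
              f"  effect range: {effect:.2f}")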

Relevance:

10.00%

Publisher:

Abstract:

Flow-induced shear stress plays an important role in regulating cell growth and distribution in scaffolds. This study sought to correlate wall shear stress and chondrocyte activity for the engineering design of micro-porous osteochondral grafts, based on the hypothesis that it is possible to capture and discriminate between the transmitted force and the cell response at the inner irregularities. Unlike common tissue engineering therapies with perfusion bioreactors, in which flow-mediated stress is the controlling parameter, this work assigned the associated stress as a function of porosity to influence the in vitro proliferation of chondrocytes. A D-optimality criterion was used to accommodate three pore characteristics for appraisal in a mixed-level fractional design of experiment (DOE), namely pore size (4 levels), distribution pattern (2 levels) and density (3 levels). Micro-porous scaffolds (n=12) were fabricated according to the DOE using rapid prototyping of an acrylic-based bio-photopolymer. Computational fluid dynamics (CFD) models were created correspondingly and used with an idealized boundary condition and a Newtonian fluid domain to simulate the dynamic microenvironment inside the pores. In vitro conditions were reproduced for the 3D-printed constructs, which were seeded at high pellet densities with human chondrocytes and cultured for 72 hours. The results showed that cell proliferation was significantly different between the constructs (p<0.05). An inlet fluid velocity of 3×10⁻² mm s⁻¹ and an average shear stress of 5.65×10⁻² Pa corresponded with increased cell proliferation for scaffolds with smaller pores in a hexagonal pattern and lower densities. Although the analytical solution for Poiseuille flow inside the pores was found insufficient to describe the flow profile, probably because of turbulence induced by the outside flow, it showed that the shear stress would increase with cell growth and decrease with pore size. This correlation provides a basis for determining the relation between the induced stress and chondrocyte activity to optimize the microfabrication of engineered cartilaginous constructs.
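For reference, the analytical estimate mentioned above is the wall shear stress of Poiseuille flow in a circular pore of radius R, which at a fixed flow rate scales inversely with the cube of the pore radius:

\[
\tau_w \;=\; \frac{4\,\mu\,\bar{u}}{R} \;=\; \frac{4\,\mu\,Q}{\pi R^{3}},
\]

where μ is the dynamic viscosity, ū the mean velocity and Q the volumetric flow rate through the pore. This captures the reported trend of higher shear in smaller and progressively occluded pores, even though, as noted, the idealisation does not fully describe the simulated flow profile.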

Relevance:

10.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to develop an effective methodology for implementing lean manufacturing strategies and a leanness evaluation metric using continuous performance measurement (CPM).

Design/methodology/approach – Based on five lean principles, a systematic lean implementation methodology for manufacturing organizations is proposed. A simplified leanness evaluation metric, consisting of both efficiency and effectiveness attributes of manufacturing performance, has been developed for continuous evaluation of lean implementation. A case study has been conducted to validate the proposed methodology, and the proposed CPM metric has been used to assess manufacturing leanness.

Findings – The proposed methodology is able to systematically identify manufacturing wastes, select appropriate lean tools, identify relevant performance indicators, achieve significant performance improvement and establish a lean culture in the organization. Continuous performance measurement metrics in terms of efficiency and effectiveness proved to be appropriate for continuous evaluation of lean performance.

Research limitations/implications – The effectiveness of the method has been demonstrated by applying it to a real-life assembly process. However, more tests/applications will be necessary to generalize the findings.

Practical implications – Results show that, by applying the methods developed, managers can successfully identify and remove manufacturing wastes from their production processes. By improving process efficiency, they can optimize their resource allocations. Manufacturers now have a validated step-by-step methodology for successfully implementing lean strategies.

Originality/value – To the authors’ best knowledge, this is the first known study to propose a systematic lean implementation methodology based on lean principles and continuous improvement techniques. Evaluation of the performance improvement achieved by lean strategies is a critical issue. This study develops a simplified leanness evaluation metric considering both efficiency and effectiveness attributes and integrates it with the lean implementation methodology.
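The abstract does not give the metric's exact form, so the following is only a hedged sketch of how a combined efficiency/effectiveness leanness score might be tracked over successive measurement periods; the formula, weights and figures are illustrative assumptions, not the paper's CPM metric.

    # Hypothetical leanness score combining one efficiency and one effectiveness attribute.
    def leanness(value_added_time_h, total_lead_time_h, good_units, target_units,
                 w_efficiency=0.5, w_effectiveness=0.5):
        efficiency = value_added_time_h / total_lead_time_h   # share of lead time adding value
        effectiveness = min(good_units / target_units, 1.0)   # delivery against target
        return w_efficiency * efficiency + w_effectiveness * effectiveness

    # Baseline period versus a period after a waste-removal cycle (invented numbers).
    print(round(leanness(6.0, 20.0, 430, 500), 2))   # baseline
    print(round(leanness(6.0, 12.0, 480, 500), 2))   # after lean improvements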

Relevance:

10.00%

Publisher:

Abstract:

The design of hydraulic turbines often has to deal with hydraulic instability. It is well known that Francis and Kaplan types present hydraulic instability in their design power range. Even if modern CFD tools may help to define these dangerous operating conditions and to optimize runner design, hydraulic instabilities may arise unexpectedly during the turbine's life and should be detected in a timely manner in order to assure a long operating life. In a previous paper, the authors considered the phenomenon of the helical vortex rope, which appears at low flow rates when a swirling flow occupies a large portion of the draft tube's conical inlet. In this condition, a strong helical vortex rope forms. The vortex rope causes mechanical effects on the runner, on the whole turbine and on the draft tube, which may eventually produce severe damage to the turbine unit and whose most evident symptoms are vibrations. The authors have already shown that vibration analysis is suitable for detecting vortex rope onset, thanks to an experimental test campaign performed during the commissioning of a 23 MW Kaplan hydraulic turbine unit. In this paper, the authors propose a sophisticated data-driven approach to detect vortex rope onset at different power loads, based on the analysis of the vibration signals in the order domain and introducing the so-called "residual order spectrogram", i.e. an order-rotation representation of the vibration signal. Some experimental test runs are presented, and the possibility of detecting instability onset, especially in real time, is discussed.
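The order-domain analysis mentioned above expresses vibration content relative to shaft rotation rather than absolute frequency; the sketch below shows the generic idea for a constant-speed record with a synthetic signal and an assumed shaft speed, and does not reproduce the authors' "residual order spectrogram" computation.

    # Generic order-domain view of a vibration signal recorded at constant shaft speed.
    # At constant speed, dividing the frequency axis (Hz) by the rotation rate (Hz) gives orders.
    import numpy as np
    from scipy.signal import spectrogram

    fs = 2000.0        # sampling rate, Hz (assumed)
    shaft_hz = 2.5     # shaft speed, rev/s (assumed constant)
    t = np.arange(0, 20.0, 1.0 / fs)

    # Synthetic signal: a 1x shaft component plus a sub-synchronous component near 0.3x,
    # roughly where a draft-tube vortex rope typically shows up, plus noise.
    rng = np.random.default_rng(0)
    x = (1.0 * np.sin(2 * np.pi * shaft_hz * t)
         + 0.6 * np.sin(2 * np.pi * 0.3 * shaft_hz * t)
         + 0.1 * rng.standard_normal(t.size))

    f, seg_times, Sxx = spectrogram(x, fs=fs, nperseg=4096)
    orders = f / shaft_hz                      # rescale the frequency axis to orders

    # Energy concentrated around 0.2-0.4x order is a common vortex-rope signature.
    band = (orders > 0.2) & (orders < 0.4)
    print("mean spectral energy in the 0.2-0.4x order band:", Sxx[band].mean())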

Relevance:

10.00%

Publisher:

Abstract:

Quality of experience (QoE) measures the overall perceived quality of mobile video delivery, combining subjective user experience and objective system performance. Current QoE computing models have two main limitations: 1) insufficient consideration of the factors influencing QoE; and 2) limited studies on QoE models for acceptability prediction. In this paper, a set of novel acceptability-based QoE models, denoted A-QoE, is proposed based on the results of comprehensive user studies on subjective quality acceptance assessments. The models are able to predict users’ acceptability and pleasantness in various mobile video usage scenarios. Statistical regression analysis has been used to build the models with a group of influencing factors as independent predictors, including encoding parameters and bitrate, video content characteristics, and mobile device display resolution. The performance of the proposed A-QoE models has been compared with three well-known objective video quality assessment metrics: PSNR, SSIM and VQM. The proposed A-QoE models have high prediction accuracy and usage flexibility. Future user-centred mobile video delivery systems can benefit from applying the proposed QoE-based management to optimize video coding and quality delivery decisions.
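A hedged sketch of the modelling idea: a regression predicting whether a clip is acceptable from bitrate, a content-complexity feature and display resolution. The predictor set, data and logistic form are illustrative assumptions; the A-QoE models in the paper were built from the study's own subjective assessment results.

    # Illustrative acceptability regression on invented data (not the A-QoE study's dataset).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Columns: bitrate (kbps), content motion complexity (0-1), display height (pixels).
    X = np.array([
        [200, 0.8, 480], [400, 0.8, 480], [800, 0.8, 480],
        [200, 0.2, 480], [400, 0.2, 720], [800, 0.2, 720],
        [1200, 0.5, 720], [1500, 0.9, 1080], [300, 0.9, 1080],
    ], dtype=float)
    accepted = np.array([0, 0, 1, 1, 1, 1, 1, 1, 0])  # hypothetical user verdicts

    model = LogisticRegression(max_iter=1000).fit(X, accepted)
    # Predicted probability that a 600 kbps, medium-complexity clip on a 720p screen is acceptable.
    print(model.predict_proba([[600.0, 0.6, 720.0]])[0, 1])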

Relevance:

10.00%

Publisher:

Abstract:

The immune system in the female reproductive tract (FRT) does not mount an attack against HIV or other sexually transmitted infections (STIs) with a single endogenously produced microbicide or with a single arm of the immune system. Instead, the body deploys dozens of innate antimicrobials into the secretions of the female reproductive tract. Working together, these antimicrobials, along with mucosal antibodies, attack many different viral, bacterial and fungal targets. Within the FRT, the unique challenges of protecting against sexually transmitted pathogens, coupled with the need to sustain the development of an allogeneic fetus, have evolved in such a way that sex hormones precisely regulate immune function to accomplish both tasks. The studies presented in this review demonstrate that estradiol and progesterone secreted during the menstrual cycle act both directly and indirectly on epithelial cells and other immune cells in the reproductive tract to modify immune function in a way that is unique to specific sites throughout the FRT. As presented in this review, studies from our laboratory and others demonstrate that the innate immune response is under hormonal control, varies with the stage of the menstrual cycle, and is suppressed at mid-cycle to optimize conditions for successful fertilization and pregnancy. This suppression creates a window of STI vulnerability during which potential pathogens, including HIV, can enter the reproductive tract and infect host targets.

Relevance:

10.00%

Publisher:

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: be that information for a specific study, tweets which can inform emergency services or other responders during an ongoing crisis, or information which gives an advantage to those involved in prediction markets. Often such a process is iterative, with keywords and hashtags changing over time, and both collection and analytic methodologies need to be continually adapted in response to this changing information. While many of the data sets collected and analyzed are pre-formed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify users who either serve as amplifiers of information or are known as authoritative sources. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme among these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
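As a hedged sketch of the content-analysis and user-profiling idea described for the first paper (the keyword weights, authority list and tweets are invented), incoming tweets can be scored and the highest-scoring ones routed to responders:

    # Toy triage scorer combining keyword relevance with a simple source-authority bonus.
    KEYWORD_WEIGHTS = {"flood": 3, "evacuate": 4, "trapped": 5, "road closed": 2}
    AUTHORITATIVE_USERS = {"examplepolice", "exampleweather"}   # hypothetical authority list

    def score_tweet(text: str, author: str) -> int:
        text_lc = text.lower()
        score = sum(weight for keyword, weight in KEYWORD_WEIGHTS.items() if keyword in text_lc)
        if author.lower() in AUTHORITATIVE_USERS:
            score += 5                       # boost known authoritative sources
        return score

    incoming = [
        ("Family trapped on roof near the creek, need help", "resident42"),
        ("Road closed at Main St due to flood water", "examplepolice"),
        ("Great coffee this morning!", "cafefan"),
    ]
    for text, author in sorted(incoming, key=lambda tw: score_tweet(*tw), reverse=True):
        print(score_tweet(text, author), author, "-", text)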

Relevance:

10.00%

Publisher:

Abstract:

Eye care practitioners (ECPs) would tend to agree that wearing contact lenses increases the risk for infection, but millions of patients are still fitted with lenses every year because ECPs feel that the risk is manageable and that their patients' eye health can be protected. The Fusarium and Acanthamoeba keratitis outbreaks of years past were a wake-up call to manufacturers, ECPs, and regulatory agencies that risk cannot be managed without diligence, and that the complex relationship between contact lens materials, contact lens solutions, and compliance needs to be better understood in order to optimize the efficacy of contact lens care and improve care guidelines.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work is to develop a demand-side response model which assists electricity consumers exposed to the market price to independently and proactively manage air-conditioning peak electricity demand. The main contribution of this research is to show how consumers can optimize the energy cost caused by the air-conditioning load under several cases, e.g. a normal price, a price spike, and a given probability of a price spike. The model also investigates how air conditioning can apply a pre-cooling method when there is a substantial risk of a price spike. The results indicate the potential of the scheme to achieve financial benefits for consumers and to target the best economic performance for electricity generation, distribution and transmission. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics for hot days from 2011 to 2012.
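The pre-cooling decision can be illustrated with a small expected-cost comparison; the prices, loads and spike probability below are assumed values for demonstration, not the Queensland market data used in the study.

    # Expected-cost comparison: pre-cool now at the normal price versus wait and risk a spike.
    # All numbers are illustrative assumptions.
    NORMAL_PRICE = 0.08   # $/kWh under normal market conditions
    SPIKE_PRICE = 2.50    # $/kWh during a market price spike
    P_SPIKE = 0.15        # assumed probability of a spike in the coming peak interval

    def expected_cost(precool: bool,
                      precool_kwh=6.0,         # extra energy used to pre-cool now
                      peak_kwh_normal=10.0,    # AC energy needed later without pre-cooling
                      peak_kwh_precooled=4.0): # AC energy needed later after pre-cooling
        now_cost = precool_kwh * NORMAL_PRICE if precool else 0.0
        later_kwh = peak_kwh_precooled if precool else peak_kwh_normal
        expected_peak_price = P_SPIKE * SPIKE_PRICE + (1 - P_SPIKE) * NORMAL_PRICE
        return now_cost + later_kwh * expected_peak_price

    print("expected cost without pre-cooling:", round(expected_cost(False), 2))
    print("expected cost with pre-cooling:   ", round(expected_cost(True), 2))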

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an efficient algorithm for multi-objective distribution feeder reconfiguration based on a Modified Honey Bee Mating Optimization (MHBMO) approach. The main objective of distribution feeder reconfiguration (DFR) is to minimize the real power loss and the deviation of node voltages. Because the objectives are different and not commensurable, it is difficult to solve the problem by conventional approaches that optimize a single objective, so a metaheuristic algorithm has been applied to this problem. The paper describes the full algorithm and the objective functions considered. The results of simulations on a 32-bus distribution system are given and show the high accuracy of the proposed algorithm and its effectiveness in power loss minimization.
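The two objectives named above can be illustrated with a simple weighted-sum fitness evaluation of one candidate configuration; the per-unit figures and weights are assumptions, and this sketch stands in for, rather than reproduces, the MHBMO search itself.

    # Evaluate a candidate feeder configuration on the two objectives from the abstract:
    # real power loss and node voltage deviation (weighted-sum form, illustrative numbers).
    def fitness(p_loss_kw, node_voltages_pu, w_loss=0.7, w_volt=0.3, p_loss_base_kw=200.0):
        loss_term = p_loss_kw / p_loss_base_kw                    # power loss, normalised to the base case
        volt_term = max(abs(1.0 - v) for v in node_voltages_pu)   # worst per-unit voltage deviation
        return w_loss * loss_term + w_volt * volt_term

    # Base configuration versus a reconfigured candidate (invented values); lower is better.
    print(round(fitness(200.0, [0.91, 0.95, 0.98, 1.00]), 3))
    print(round(fitness(140.0, [0.94, 0.96, 0.99, 1.00]), 3))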