895 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations
Abstract:
Clean and renewable energy generation and supply have drawn much attention worldwide in recent years; proton exchange membrane (PEM) fuel cells and solar cells are among the most popular technologies. Accurately modeling PEM fuel cells and solar cells is critical to their application, and this involves the identification and optimization of model parameters. This is challenging, however, due to the highly nonlinear and complex nature of the models. In particular, the PEM fuel cell model has to be optimized under different operating conditions, making the solution space extremely complex. In this paper, an improved and simplified teaching-learning based optimization algorithm (STLBO) is proposed to identify and optimize the parameters of these two cell models. This is achieved by introducing an elite strategy to improve the quality of the population and a local search to further enhance the performance of the global best solution; a chaotic map is also introduced to improve the diversity of the local search. Compared with the basic TLBO, the structure of the proposed algorithm is much simpler and its search ability is significantly stronger. The performance of the proposed STLBO is first tested and verified on two low-dimension decomposable problems and twelve large-scale benchmark functions, and then on the parameter identification of PEM fuel cell and solar cell models. Intensive simulation experiments show that the proposed STLBO exhibits excellent accuracy and speed in comparison with algorithms reported in the literature.
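The loop described in the abstract can be sketched in a few lines. This is a minimal illustration only: the teacher-phase update, the greedy elite acceptance, and the logistic-map chaotic perturbation of the best solution are written from the abstract's description, so all parameter choices (population size, teaching factor, chaotic step scale) are assumptions rather than the authors' exact algorithm.

```python
import random

def stlbo(f, dim, bounds, pop_size=20, iters=200, seed=1):
    """Minimise f over a box: a sketch of a simplified TLBO loop with
    an elite (greedy) acceptance step and a chaotic local search."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    z = 0.7  # state of the logistic chaotic map
    for _ in range(iters):
        best = min(range(pop_size), key=fit.__getitem__)
        mean = [sum(x[j] for x in pop) / pop_size for j in range(dim)]
        for i in range(pop_size):
            tf = rng.choice((1, 2))  # teaching factor
            cand = [pop[i][j] + rng.random() * (pop[best][j] - tf * mean[j])
                    for j in range(dim)]
            cand = [min(hi, max(lo, v)) for v in cand]
            fc = f(cand)
            if fc < fit[i]:  # elite step: accept only improvements
                pop[i], fit[i] = cand, fc
        z = 4.0 * z * (1.0 - z)  # logistic map, stays in (0, 1)
        step = (hi - lo) * 0.01 * (2 * z - 1)
        cand = [min(hi, max(lo, v + step)) for v in pop[best]]
        fc = f(cand)
        if fc < fit[best]:  # chaotic local search around the best
            pop[best], fit[best] = cand, fc
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# usage: minimise the 5-dimensional sphere function
x, fx = stlbo(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10))
```

On a simple separable function like the sphere, the loop converges quickly, which mirrors the benchmark-function tests mentioned in the abstract.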
Abstract:
This paper describes how urban agriculture differs from conventional agriculture not only in the way it engages with the technologies of growing, but also in the choice of crop and the way crops are brought to market. The authors propose a new model for understanding these new relationships, analogous to a systems view of information technology: Hardware-Software-Interface.
The first component of the system is hardware: the technological component of the agricultural system. Technology is often thought of as equipment, but its linguistic root, ‘technis’, means ‘know-how’. Urban agriculture has to engage new technologies, ones that deal with a scale of operation and a context different from those of rural agriculture. Often the scale is very small, and soils are polluted. The technology in urban agriculture may therefore be technical, such as aquaponic systems, or soil-based, such as allotments, window-boxes, or permaculture. The choice of method does not necessarily determine the crop produced or its efficiency. This is linked to the biotic element added to the hardware, which is seen as the ‘software’.
The software of the system is its ecological part. It produces the crop, which may or may not be determined by the technology used. For example, a hydroponic system could produce a range of crops, or even fish or edible flowers. Software choice can be driven by ideological preferences, such as permaculture, where companion planting is used to reduce disease and pests, or by economic factors, such as the state of the local market at a particular time of year. The monetary value of the ‘software’ is determined by the market. Obviously, small, locally produced crops are unlikely to compete against intensive products produced globally; however, value might be measured locally in different ways, and the crop might be sold on a different market. This leads to the final part of the analogy: the interface.
The interface is the link between the system and the consumer. In traditional agriculture, there is only a tenuous link between the producer of asparagus in Peru and the consumer in Europe; in fact, very little of the money spent by the consumer ever reaches the grower. Most of it is spent on refrigeration, transport and profit for agents and supermarket chains. Local or hyper-local agriculture needs to bypass these systems and connect more directly to the consumer. This is the interface. In hyper-localised systems, effectiveness is often more important than efficiency, and direct links between producer and consumer create new economies.
Abstract:
Promoter hypermethylation is recognized as a hallmark of human cancer, in addition to conventional mechanisms of gene inactivation. As such, many new technologies have been developed over the past two decades to uncover novel targets of methylation and decipher complex epigenetic patterns. However, many of these are either labor intensive or provide limited data, confined to oligonucleotide hybridization sequences or enzyme cleavage sites, and cannot easily be applied to screening large sets of sequences or samples. We present an application of denaturing high performance liquid chromatography (DHPLC), which relies on bisulfite modification of genomic DNA, for methylation screening. We validated DHPLC as a methylation screening tool using GSTP1, a well-known target of methylation in prostate cancer. We developed an in silico approach to identify potential targets of promoter hypermethylation in prostate cancer and, using DHPLC, screened two of these targets, LGALS3 and SMAD4, for methylation. We show that DHPLC can serve as a fast, sensitive, quantitative and cost-effective method for screening novel targets or DNA samples for DNA methylation.
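The bisulfite step that DHPLC screening relies on is easy to illustrate in code: unmethylated cytosines are converted to uracil (and later read as thymine), while methylated CpG cytosines are protected, so methylated and unmethylated copies of a locus end up with different sequences. The function and example sequence below are invented for illustration.

```python
def bisulfite_convert(seq, methylated_positions):
    """Simulate bisulfite conversion of one strand: an unmethylated C
    becomes T (via U after PCR), while a C at a listed methylated
    position (e.g. the C of a methylated CpG) is protected."""
    out = []
    for i, base in enumerate(seq):
        if base == "C" and i not in methylated_positions:
            out.append("T")
        else:
            out.append(base)
    return "".join(out)

# a CpG at index 2: the methylated copy keeps its C, the unmethylated does not
meth = bisulfite_convert("ACCGTCG", {2})
unmeth = bisulfite_convert("ACCGTCG", set())
```

It is this induced sequence difference between methylated and unmethylated molecules that downstream separation methods such as DHPLC can resolve.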
Abstract:
Background The use of technology in healthcare settings is on the increase and may represent a cost-effective means of delivering rehabilitation. Reductions in treatment time, and delivery in the home, are also thought to be benefits of this approach. Children and adolescents with brain injury often experience deficits in memory and executive functioning that can negatively affect their school work, social lives, and future occupations. Effective interventions that can be delivered at home, without the need for high-cost clinical involvement, could provide a means to address a current lack of provision. We have systematically reviewed studies examining the effects of technology-based interventions for the rehabilitation of deficits in memory and executive functioning in children and adolescents with acquired brain injury. Objectives To assess the effects of technology-based interventions compared to placebo intervention, no treatment, or other types of intervention, on the executive functioning and memory of children and adolescents with acquired brain injury. Search methods We ran the search on 30 September 2015. We searched the Cochrane Injuries Group Specialised Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Ovid MEDLINE(R), Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations, Ovid MEDLINE(R) Daily and Ovid OLDMEDLINE(R), EMBASE Classic + EMBASE (OvidSP), ISI Web of Science (SCI-EXPANDED, SSCI, CPCI-S, and CPSI-SSH), CINAHL Plus (EBSCO), two other databases, and clinical trials registers. We also searched the internet, screened reference lists, and contacted authors of included studies. Selection criteria Randomised controlled trials comparing the use of a technological aid for the rehabilitation of children and adolescents with memory or executive-functioning deficits with placebo, no treatment, or another intervention.
Data collection and analysis Two review authors independently reviewed titles and abstracts identified by the search strategy. Following retrieval of full-text manuscripts, two review authors independently performed data extraction and assessed the risk of bias. Main results Four studies (involving 206 participants) met the inclusion criteria for this review. Three studies, involving 194 participants, assessed the effects of online interventions targeting executive functioning (that is, monitoring and changing behaviour, problem solving, planning, etc.). These studies, which were all conducted by the same research team, compared online interventions against a 'placebo' (participants were given internet resources on brain injury). The interventions were delivered in the family home with additional support or training, or both, from a psychologist or doctoral student. The fourth study investigated the use of a computer program to target memory in addition to components of executive functioning (that is, attention, organisation, and problem solving). No information on the study setting was provided; however, a speech-language pathologist, teacher, or occupational therapist accompanied participants. Two studies assessed adolescents and young adults with mild to severe traumatic brain injury (TBI), while the remaining two studies assessed children and adolescents with moderate to severe TBI. Risk of bias We assessed the risk of selection bias as low for three studies and unclear for one study. Allocation bias was high in two studies, unclear in one study, and low in one study. Only one study (n = 120) was able to conceal allocation from participants, therefore overall selection bias was assessed as high. One study took steps to blind assessors to allocation (low risk of detection bias), while the other three did not (high risk of detection bias).
Primary outcome 1: Executive functioning: Technology-based intervention versus placebo Results from a meta-analysis of three studies (n = 194) comparing online interventions with a placebo for children and adolescents with TBI favoured the intervention immediately post-treatment (standardised mean difference (SMD) -0.37, 95% confidence interval (CI) -0.66 to -0.09; heterogeneity: P = 0.62, I² = 0%). (As there is no 'gold standard' measure in the field, we have not translated the SMD back to any particular scale.) This result is thought to represent only a small to medium effect size (using Cohen’s rule of thumb, where 0.2 is a small effect, 0.5 a medium one, and 0.8 or above a large effect); this is unlikely to have a clinically important effect on the participant. The fourth study (n = 12) reported differences between the intervention and control groups on problem solving (an important component of executive functioning). No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. The quality of evidence for this outcome according to GRADE was very low, meaning that future research is highly likely to change the estimate of effect. Primary outcome 2: Memory One small study (n = 12) reported a statistically significant improvement in sentence recall in the intervention group relative to the control group following an eight-week remediation programme. No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. Secondary outcomes Two studies (n = 158) reported on anxiety/depression as measured by the Child Behavior Checklist (CBCL) and were included in a meta-analysis. We found no evidence of an effect of the intervention (mean difference -5.59, 95% CI -11.46 to 0.28; I² = 53%). The GRADE quality of evidence for this outcome was very low, meaning future research is likely to change the estimate of effect. A single study sought to record adverse events and reported none.
Two studies reported on use of the intervention (range 0 to 13 and 1 to 24 sessions). One study reported on social functioning/social competence and found no effect. The included studies reported no data for other secondary outcomes (that is quality of life and academic achievement). Authors' conclusions This review provides low-quality evidence for the use of technology-based interventions in the rehabilitation of executive functions and memory for children and adolescents with TBI. As all of the included studies contained relatively small numbers of participants (12 to 120), our findings should be interpreted with caution. The involvement of a clinician or therapist, rather than use of the technology, may have led to the success of these interventions. Future research should seek to replicate these findings with larger samples, in other regions, using ecologically valid outcome measures, and reduced clinician involvement.
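The pooled effects reported above come from standard inverse-variance meta-analysis, which can be sketched in a few lines. The per-study SMDs and standard errors below are invented for illustration, not the review's data, although they are chosen so the pooled result lands near the reported SMD of -0.37.

```python
import math

def pooled_smd(smds, ses):
    """Fixed-effect inverse-variance pooling of standardised mean
    differences: each study is weighted by 1/SE^2, and the pooled
    standard error is sqrt(1 / sum of weights)."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# three invented studies: (SMD, standard error) pairs
est, (lo, hi) = pooled_smd([-0.5, -0.3, -0.35], [0.25, 0.22, 0.30])
```

A negative pooled SMD with a confidence interval excluding zero corresponds to the "favoured the intervention" result quoted above.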
Abstract:
Increasing litter size has long been a goal of pig breeders and producers, and may have implications for pig (Sus scrofa domesticus) welfare. This paper reviews the scientific evidence on biological factors affecting sow and piglet welfare in relation to large litter size. It is concluded that, in a number of ways, large litter size is a risk factor for decreased animal welfare in pig production. Increased litter size is associated with increased piglet mortality, which is likely to be associated with significant negative animal welfare impacts. In surviving piglets, many of the causes of mortality can also occur in non-lethal forms that cause suffering. Intense teat competition may increase the likelihood that some piglets do not gain adequate access to milk, causing starvation in the short term and possibly long-term detriments to health. Also, increased litter size leads to more piglets with low birth weight which is associated with a variety of negative long-term effects. Finally, increased production pressure placed on sows bearing large litters may produce health and welfare concerns for the sow. However, possible biological approaches to mitigating health and welfare issues associated with large litters are being implemented. An important mitigation strategy is genetic selection encompassing traits that promote piglet survival, vitality and growth. Sow nutrition and the minimisation of stress during gestation could also contribute to improving outcomes in terms of piglet welfare. Awareness of the possible negative welfare consequences of large litter size in pigs should lead to further active measures being taken to mitigate the mentioned effects. © 2013 Universities Federation for Animal Welfare.
Abstract:
Fraud in the global food supply chain is becoming increasingly common due to the huge profits associated with this type of criminal activity. Food commodities and ingredients that are expensive and are part of complex supply chains are particularly vulnerable. Herbs and spices fit both criteria perfectly, and yet strategies to detect fraudulent adulteration are still far from robust. An FT-IR screening method coupled with chemometric data analysis was developed, together with a second method using LC-HRMS, the latter detecting commonly used adulterants by biomarker identification. This two-tier testing strategy was applied to 78 samples obtained from a variety of retail and on-line sources. The two tests agreed completely: over 24% of all samples tested had some form of adulterant present. The innovative strategy devised could potentially be used to test global supply chains for fraud in many different forms of herbs.
Abstract:
Portugal's heavy dependence on foreign energy sources (mainly fossil fuels), together with its international commitments, its national energy policy strategy, and resource sustainability and climate change concerns, inevitably forces Portugal to invest in its energy self-sufficiency. The 20/20/20 Strategy defined by the European Union states that by 2020, 60% of total electricity consumption must come from renewable energy sources. Wind energy is currently a major source of electricity generation in Portugal, supplying about 23% of total national electricity consumption in 2013. The National Energy Strategy 2020 (ENE2020), which aims to ensure national compliance with the European 20/20/20 Strategy, states that about half of this 60% target will be provided by wind energy. This work aims to implement and optimise a numerical weather prediction model for simulating and modelling the wind energy resource in Portugal, in both offshore and onshore areas. The optimisation consisted in determining which initial and boundary conditions and which planetary boundary layer physical parameterisations yield wind power flux (or energy density), wind speed and wind direction simulations closest to in situ measured wind data. Specifically for offshore areas, the work also evaluates whether the optimised model produces power flux, wind speed and direction simulations more consistent with in situ measured data than wind measurements collected by satellites. Finally, this work studies and analyses possible impacts of anthropogenic climate change on the future wind energy resource in Europe.
The results show that, among all the forcing databases currently available to drive numerical weather prediction models, the ECMWF ERA-Interim reanalysis allows wind power flux, wind speed and direction simulations most consistent with in situ wind measurements. The Pleim-Xiu and ACM2 planetary boundary layer parameterisations showed the best performance for wind power flux, wind speed and direction simulations. This optimisation significantly reduced simulation errors and, specifically for offshore areas, produced wind power flux, wind speed and direction simulations more consistent with in situ wind measurements than data obtained from satellites, a very valuable achievement. The work also revealed that anthropogenic climate change may negatively affect the future European wind energy resource, owing to a tendency towards reduced wind speeds, especially by the end of the current century and under stronger radiative forcing conditions.
Abstract:
During the late twentieth century the supply chains for gold were considered by the majority of consumers (when they were considered at all) to be driven by simple commercial imperatives. That notion was shattered during the first decade of the twenty-first century by the appearance of ethical campaigns, led by advocates determined to present major players in the gold industry as morally reprehensible. The ‘No Dirty Gold’ campaign sought to shift the purchasing of gold onto a moral register, in order to challenge the activities of large mining corporations. It was followed by the Fairtrade Foundation’s ‘Fairtrade Gold’ initiative, which had aspirations to support subsistence mining communities at the expense of big business. By directly targeting a luxury material and playing on its inherent social ambiguities, campaigners hoped to thoroughly moralise the purchasing of gold objects. Dr Oakley’s presentation will examine the forces behind this developing social phenomenon, describe the trajectories of a selection of major campaigns, and consider the extent to which these have impacted on public attitudes, gold miners and the actions of consumers, producers and retailers of luxury goods.
Abstract:
PURPOSE: To examine associations of risk-taking and risk-perception with perceived exertion, pacing and performance in athletes. METHODS: Two experiments were conducted in which risk-perception was assessed using the domain-specific risk-taking (DOSPERT) scale in 20 novice cyclists (Experiment 1) and 32 experienced ultra-marathon runners (Experiment 2). In Experiment 1, participants predicted their pace and then performed a 5 km maximum-effort cycling time-trial on a calibrated KingCycle mounted bicycle. Split-times and perceived exertion were recorded every kilometer. In Experiment 2, each participant predicted their split times before running a 100 km ultra-marathon. Split-times and perceived exertion were recorded at 7 check-points. In both experiments, higher and lower risk-perception groups were created using a median split of DOSPERT scores. RESULTS: In Experiment 1, pace during the first km was faster among lower compared to higher risk-perceivers, t(18) = 2.0, P = 0.03, and faster among higher compared to lower risk-takers, t(18) = 2.2, P = 0.02. Actual pace was slower than predicted pace during the first km in both the higher risk-perceivers, t(9) = -4.2, P = 0.001, and the lower risk-perceivers, t(9) = -1.8, P = 0.049. In Experiment 2, pace during the first 36 km was faster among lower compared to higher risk-perceivers, t(16) = 2.0, P = 0.03. Irrespective of risk-perception group, actual pace was slower than predicted pace during the first 18 km, t(16) = 8.9, P < 0.001, and from 18 to 36 km, t(16) = 4.0, P < 0.001. In both experiments there was no difference in performance between higher and lower risk-perception groups. CONCLUSIONS: Initial pace is associated with an individual's perception of risk, with low perceptions of risk associated with a faster starting pace. Large differences between predicted and actual pace suggest that the performance template lacks accuracy, perhaps indicating greater reliance on momentary pacing decisions rather than a pre-planned strategy.
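The grouping and comparison used in both experiments — a median split on DOSPERT scores followed by two-sample t tests — can be sketched as follows. The helper names and the data are illustrative, not the authors' analysis code, and the tie-handling convention at the median is an assumption.

```python
import statistics

def median_split(scores):
    """Split participants into lower/higher groups around the median of a
    risk-perception score (e.g. DOSPERT totals). Ties at the median go to
    the lower group here; that is one convention among several."""
    med = statistics.median(scores)
    lower = [i for i, s in enumerate(scores) if s <= med]
    higher = [i for i, s in enumerate(scores) if s > med]
    return lower, higher

def welch_t(a, b):
    """Welch's two-sample t statistic (degrees of freedom omitted for
    brevity), e.g. for comparing first-km pace between the groups."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# invented DOSPERT totals for six participants
lower, higher = median_split([38, 52, 44, 61, 47, 55])
# invented first-km times (min) for the two groups: negative t -> group 1 faster
t = welch_t([24.1, 23.8, 24.5], [25.0, 25.4, 24.9])
```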
Abstract:
Sakr challenges the notion that transnational media technologies have forced states in the Arab Middle East to cede ever more control to non-state players since the 1990s. Taking account of a long history of foreign political engineering in Arab countries, she probes the realities of Arab broadcasting privatization, intra-regional harmonization of government communication policies, and external financial support for media freedom and reform, to show how Arab governments were largely successful in harnessing the forces implicated in media globalization in a way that entrenched authoritarian elements of the status quo. The findings validate an alternative to globalization theory, one that places a dual focus on the agency of national ruling elites and the international structures that underpin the power of those elites today, as in the past.
Abstract:
Partner selection is crucial to green supply chain management, as the focal firm is responsible for the environmental performance of the whole supply chain. The construction of appropriate selection criteria is an essential, but often neglected, pre-requisite of the partner selection process. This paper proposes a three-stage model that combines Dempster-Shafer belief acceptability theory with the particle swarm optimization technique for the first time in this application. This enables effectiveness (consideration of the inter-dependence of a broad range of quantitative and qualitative selection criteria) and efficiency (use of scarce resources during the criteria construction process) to be optimized simultaneously. It also enables both operational and strategic attributes to be selected at different levels of the criteria hierarchy in different decision-making environments. The practical efficacy of the model is demonstrated by an application in Company ABC, a large Chinese electronic equipment and instrument manufacturer.
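The particle swarm side of such a model can be sketched as a binary PSO searching over which selection criteria to include. The objective function here is a toy stand-in — in the paper it would be informed by Dempster-Shafer belief acceptability values — and all parameter names and settings below are assumptions for illustration.

```python
import math
import random

def binary_pso(score, n_criteria, swarm=15, iters=60, seed=3):
    """Minimal binary PSO: each particle is a 0/1 vector marking which
    criteria to keep; velocities are squashed through a sigmoid to give
    the probability of a bit being 1. Maximises `score`."""
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[rng.randint(0, 1) for _ in range(n_criteria)] for _ in range(swarm)]
    vel = [[0.0] * n_criteria for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pscore = [score(p) for p in pos]
    g = max(range(swarm), key=pscore.__getitem__)
    gbest, gscore = pbest[g][:], pscore[g]
    for _ in range(iters):
        for i in range(swarm):
            for j in range(n_criteria):
                vel[i][j] = (0.7 * vel[i][j]
                             + 1.5 * rng.random() * (pbest[i][j] - pos[i][j])
                             + 1.5 * rng.random() * (gbest[j] - pos[i][j]))
                pos[i][j] = 1 if rng.random() < sig(vel[i][j]) else 0
            s = score(pos[i])
            if s > pscore[i]:  # update personal and global bests
                pbest[i], pscore[i] = pos[i][:], s
                if s > gscore:
                    gbest, gscore = pos[i][:], s
    return gbest, gscore

# toy objective: criteria 0 and 2 are valuable, every extra criterion costs 1
best, val = binary_pso(lambda x: 3 * x[0] + 3 * x[2] - sum(x), n_criteria=6)
```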
Abstract:
This research examines media integration in China, choosing two Chinese newspaper groups as cases for comparative study. The study analyses the convergence strategies of these Chinese groups by reference to a role model of convergence developed from a literature review of cases of media convergence in the UK – in particular the Guardian (GNM), Telegraph Media Group (TMG), the Daily Mail and the Times. The UK cases serve to establish the characteristics, causes and consequences of different forms of convergence and to formulate a model of convergence. The model specifies the levels of newsroom convergence and the sub-units of analysis that will be used to collect empirical data from Chinese news organisations and to compare their strategies, practices and results with the UK experience. The literature review shows that there is a need for more comparative studies of media convergence strategy in general, and particularly in relation to Chinese media; the study therefore addresses a gap in the understanding of media convergence in China. The innovations of this study are threefold. First, it develops a new and comprehensive model of media convergence and a detailed understanding of the reasons why media companies pursue differing strategies in managing convergence across a wide range of units of analysis. Second, it compares the multimedia strategies of media groups under radically different political systems. Third, since there is no standard research method or systematic theoretical framework for the study of newsroom convergence, it develops an integrated perspective, using triangulation of textual analysis, field observation and interviews to explain systematically what the newsroom structure was like in the past, how the copy flow changed, and why. Finally, this case study of media groups can provide an industrial model or framework for other media groups.
Abstract:
This paper proposes a computationally efficient methodology for the optimal location and sizing of static and switched shunt capacitors in large distribution systems. The problem is formulated as the maximization of the savings produced by the reduction in energy losses and the avoided costs due to investment deferral in the expansion of the network. The proposed method selects the nodes to be compensated, as well as the optimal capacitor ratings and their operational characteristics, i.e. fixed or switched. After an appropriate linearization, the optimization problem was formulated as a large-scale mixed-integer linear problem, suitable for solution by a widespread commercial package. Results of the proposed optimization method are compared with another recent methodology reported in the literature using two test cases: a 15-bus and a 33-bus distribution network. For both test cases, the proposed methodology delivers better solutions, indicated by higher loss savings achieved with lower amounts of capacitive compensation. The proposed method has also been applied to compensate an actual large distribution network served by AES-Venezuela in the metropolitan area of Caracas. A convergence time of about 4 seconds after 22,298 iterations demonstrates the ability of the proposed methodology to handle large-scale compensation problems efficiently.
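The trade-off being optimized — loss savings versus capacitor investment — can be illustrated on a toy radial feeder. The brute-force search below stands in for the paper's large-scale mixed-integer linear programme, and every number in it (loads, resistances, prices) is invented for illustration only.

```python
# Toy 4-node radial feeder (node 0 = substation). Per-node reactive
# load (kvar) and resistance of the branch feeding each node (ohm).
Q_LOAD = [0, 300, 200, 150]
R_UP = [0.0, 0.05, 0.08, 0.12]
RATINGS = [0, 150, 300]   # candidate fixed-capacitor sizes (kvar)
CAP_COST = 0.05           # annualised investment, $/kvar
LOSS_PRICE = 0.001        # $ per unit of the Q^2 * R loss proxy

def annual_cost(caps):
    """Crude cost proxy: the reactive power through the branch into node n
    is the net (load - capacitor) demand of node n and everything
    downstream on the chain; losses scale with R * Q^2 per branch."""
    loss = 0.0
    for n in range(1, len(Q_LOAD)):
        q = sum(Q_LOAD[m] - caps[m] for m in range(n, len(Q_LOAD)))
        loss += R_UP[n] * q * q
    return LOSS_PRICE * loss + CAP_COST * sum(caps)

# exhaustive search over all placements stands in for the MILP solver
best = min(([0, a, b, c] for a in RATINGS for b in RATINGS for c in RATINGS),
           key=annual_cost)
```

Even on this tiny feeder the optimum places some capacitance, because the loss savings outweigh the investment cost — the same savings-maximization trade-off the paper solves at scale.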
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. MASCEM provides several dynamic strategies for agents’ behavior. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses a reinforcement learning algorithm to learn from experience how to choose the best from a set of possible bids. These bids are defined according to the cost function that each producer presents.
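Learning which bid to choose from experience can be sketched as a simple epsilon-greedy scheme over a discrete set of candidate bids. The abstract does not name the specific reinforcement learning algorithm used, so this is an assumed stand-in, with invented noisy profit distributions in place of MASCEM's simulated market outcomes.

```python
import random

def learn_bid(profits, episodes=2000, eps=0.1, seed=7):
    """Epsilon-greedy learning over discrete bids: keep a running-average
    estimate of the profit observed for each bid, usually play the current
    best one, and occasionally explore a random bid."""
    rng = random.Random(seed)
    n = len(profits)
    value = [0.0] * n  # running-average profit estimate per bid
    count = [0] * n
    for _ in range(episodes):
        if rng.random() < eps:
            a = rng.randrange(n)                      # explore
        else:
            a = max(range(n), key=value.__getitem__)  # exploit
        r = profits[a](rng)                           # sampled market profit
        count[a] += 1
        value[a] += (r - value[a]) / count[a]         # incremental mean
    return max(range(n), key=value.__getitem__)

# three candidate bids with invented noisy profits; the middle one pays best
bids = [lambda rng: 10 + rng.gauss(0, 2),
        lambda rng: 14 + rng.gauss(0, 2),
        lambda rng: 8 + rng.gauss(0, 2)]
best_bid = learn_bid(bids)
```

In a market simulator, the sampled profit would come from running one market session with the chosen bid, so the agent gradually concentrates on the bids that pay best against the other players.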
Abstract:
A good verification strategy should bring the simulation and real operating environments close together. In this paper we describe a system-level co-verification strategy that uses a common flow for functional simulation, timing simulation and functional debug. This last step requires a boundary-scan test (BST) infrastructure, now widely available in commercial devices, especially FPGAs with medium/large pin counts.