874 results for New Iterative Method
Abstract:
Cooling and sinking of dense saline water in the Norwegian–Greenland Sea is essential for the formation of North Atlantic Deep Water. Convection in the Norwegian–Greenland Sea allows a northward flow of warm surface water and a southward transport of cold saline water. This circulation system is highly sensitive to climate change and has been shown to operate in different modes. In ice cores the last glacial period is characterized by millennial-scale Dansgaard–Oeschger (D–O) events of warm interstadials and cold stadials. Similar millennial-scale variability (linked to D–O events) is evident from oceanic cores, suggesting a strong coupling of the atmospheric and oceanic circulation systems. Particularly long-lasting cold stadials correlate with North Atlantic Heinrich events, in which icebergs released from the continents spread meltwater over the northern North Atlantic and Nordic seas. The meltwater layer is believed to have caused a stop or near-stop in deep convection, leading to a cold climate. The spreading of meltwater and changes in oceanic circulation have a large influence on the carbon exchange between the atmosphere and the deep ocean and lead to profound changes in the 14C activity of the surface ocean. Here we demonstrate marine 14C reservoir ages (R) of up to c. 2000 years for Heinrich event H4. Our R estimates are based on a new method for age-model construction using identified tephra layers and tie-points based on abrupt interstadial warmings.
Abstract:
The cycle of the academic year impacts efforts to refine and improve major group design-build-test (DBT) projects, since the time to run and evaluate a project is generally a full calendar year. By definition these major projects have a high degree of complexity, since they act as the vehicle for the application of a range of technical knowledge and skills. There is also often an extensive list of desired learning outcomes, which extends to professional skills and attributes such as communication and team working. It is contended that student project definition and operation, like any other designed product, requires a number of iterations to achieve optimisation. The problem, however, is that if this cycle takes four or more years, then by the time a project's operational structure is fine-tuned it is quite possible that the project theme is no longer relevant. Most students will also inevitably have a sub-optimal project experience over the five-year development period. It would be much better if the ratio were flipped, so that an optimised project definition could be achieved in one year and have sufficient longevity to run in the same efficient manner for four further years. An increased number of parallel investigators would also enable more varied and adventurous project concepts to be examined than a single institution could undertake alone in the same time frame.
This work-in-progress paper describes a parallel-processing methodology for the accelerated definition of new student DBT project concepts. The methodology has been devised and implemented by a number of CDIO partner institutions in the UK & Ireland region. An agreed project theme was operated in parallel in one academic year, with the objective of replacing a multi-year iterative cycle. Additionally, the close collaboration and peer learning derived from the interaction between the coordinating academics facilitated the development of faculty teaching skills in line with CDIO Standard 10.
Abstract:
A new method for ketone enolate C-acylation is described which utilizes alkyl pentafluorophenylcarbonates, thiocarbonates and thionocarbonates as the reactive acylating agents, and MgBr2·Et2O, DMAP and i-Pr2NEt as the reagents for enolization. A wide range of ketones has been observed to undergo clean C-acylation via this protocol.
Abstract:
A new method is presented for transmission loss allocation based on the separation of transmission loss caused by load and the loss due to circulating currents between generators. The theoretical basis for and derivation of the loss formulae are presented using simple systems. The concept is then extended to a general power system using the Ybus model. Details of the application of the proposed method to a typical power system are presented along with results from the IEEE 30 bus test system. The results from both the small system and the standard IEEE test system demonstrate the validity of the proposed method.
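The Ybus bookkeeping that the abstract's loss-allocation method builds on can be sketched numerically. The 3-bus network and voltage profile below are made up for illustration and are not the paper's test systems; the sketch only shows the standard identity that the net injected active power over all buses equals the total transmission loss to be allocated.

```python
import numpy as np

# Hypothetical 3-bus network: identical lines with admittance 5 - 15j p.u.
# (positive R and X), no shunts, so each Ybus row sums to zero.
Ybus = np.array([
    [10 - 30j, -5 + 15j, -5 + 15j],
    [-5 + 15j, 10 - 30j, -5 + 15j],
    [-5 + 15j, -5 + 15j, 10 - 30j],
])
# An assumed (already solved) bus-voltage profile in p.u.
V = np.array([1.05 + 0j, 1.02 - 0.02j, 0.98 - 0.05j])

I = Ybus @ V                 # bus injection currents, I = Ybus * V
S = V * np.conj(I)           # complex power injected at each bus
P_loss = float(np.real(np.sum(S)))  # net injected active power = total I^2*R loss
```

Splitting `P_loss` into a load-caused component and a component due to circulating currents between generators is the paper's actual contribution and is not reproduced here.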
Abstract:
Best concrete research paper by a student - Research has shown that the cost of managing structures puts a high strain on the infrastructure budget, with estimates of over 50% of the European construction budget being dedicated to repair and maintenance. If reinforced concrete structures are not suitably designed and adequately maintained, their service life is compromised and the full economic value of the investment is not realised. The issue is more prevalent in coastal structures as a result of combinations of aggressive actions, such as those caused by chlorides, sulphates and cyclic freezing and thawing.
It is common practice nowadays to ensure the durability of reinforced concrete structures by specifying a concrete mix and a nominal cover at the design stage to cater for the exposure environment. This in theory should produce the performance required to achieve a specified service life. Although the European Standard EN 206-1 specifies variations in the exposure environment, it does not take into account the macro and micro climates surrounding structures, which have a significant influence on their performance and service life. Therefore, in order to construct structures which will perform satisfactorily in different exposure environments, the following two aspects need to be developed: a performance-based specification to supplement EN 206-1, which will outline the expected performance of the structure in a given environment; and a simple yet transferable procedure for assessing the performance of structures in service, termed KPI Theory. This will allow asset managers not only to design structures for the intended service life, but also to take informed maintenance decisions should the performance in service fall short of what was specified. This paper aims to discuss this further.
Abstract:
Physicians expect a treatment to be more effective when its clinical outcomes are described as relative rather than as absolute risk reductions. We examined whether effects of presentation method (relative vs. absolute risk reduction) remain when physicians are provided with the baseline risk, a vital piece of statistical information omitted in previous studies. Using a between-subjects design, ninety-five physicians were presented with the risk reduction associated with a fictitious treatment for hypertension either as an absolute risk reduction or as a relative risk reduction, with or without baseline risk information. Physicians reported that the treatment would be more effective, and that they would be more willing to prescribe it, when its risk reduction was presented to them in relative rather than in absolute terms. The relative risk reduction was perceived as more effective than the absolute risk reduction even when the baseline risk was explicitly reported. We recommend that information about absolute risk reduction be made available to physicians in the reporting of clinical outcomes. Moreover, health professionals should be cognizant of the potential biasing effects of risk information presented in relative terms.
Abstract:
Mobile malware has been growing in scale and complexity as smartphone usage continues to rise. Android has surpassed other mobile platforms as the most popular whilst also witnessing a dramatic increase in malware targeting the platform. A worrying trend that is emerging is the increasing sophistication of Android malware to evade detection by traditional signature-based scanners. As such, Android app marketplaces remain at risk of hosting malicious apps that could evade detection before being downloaded by unsuspecting users. Hence, in this paper we present an effective approach to alleviate this problem based on Bayesian classification models obtained from static code analysis. The models are built from a collection of code and app characteristics that provide indicators of potential malicious activities. The models are evaluated with real malware samples in the wild and results of experiments are presented to demonstrate the effectiveness of the proposed approach.
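As an illustration of the kind of model described, here is a Bernoulli naive Bayes classifier over binary static-analysis indicators. The feature set, training samples, and equal class priors below are hypothetical; the paper's actual code and app characteristics are not given in the abstract.

```python
import math

# Toy binary features per app (hypothetical), e.g.
# [requests_SMS_permission, loads_native_code, uses_crypto_API]
benign  = [[0, 0, 1], [0, 1, 1], [0, 0, 0], [1, 0, 1]]
malware = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 0]]

def fit(samples):
    """Per-feature Bernoulli likelihoods with Laplace (+1) smoothing."""
    n = len(samples)
    return [(sum(s[j] for s in samples) + 1) / (n + 2)
            for j in range(len(samples[0]))]

p_ben, p_mal = fit(benign), fit(malware)

def log_posterior(x, theta, prior):
    # log P(class) + sum of per-feature Bernoulli log-likelihoods
    ll = math.log(prior)
    for xj, tj in zip(x, theta):
        ll += math.log(tj if xj else 1 - tj)
    return ll

def classify(x):
    # Equal class priors assumed for the sketch.
    return ("malware"
            if log_posterior(x, p_mal, 0.5) > log_posterior(x, p_ben, 0.5)
            else "benign")
```

Scoring an unseen app then amounts to extracting its feature vector by static analysis and comparing the two class posteriors, with no signature matching involved.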
Abstract:
This paper presents a new approach to speech enhancement from single-channel measurements involving both noise and channel distortion (i.e., convolutional noise), and demonstrates its applications for robust speech recognition and for improving noisy speech quality. The approach is based on finding longest matching segments (LMS) from a corpus of clean, wideband speech. The approach adds three novel developments to our previous LMS research. First, we address the problem of channel distortion as well as additive noise. Second, we present an improved method for modeling noise for speech estimation. Third, we present an iterative algorithm which updates the noise and channel estimates of the corpus data model. In experiments using speech recognition as a test with the Aurora 4 database, the use of our enhancement approach as a preprocessor for feature extraction significantly improved the performance of a baseline recognition system. In another comparison against conventional enhancement algorithms, both the PESQ and the segmental SNR ratings of the LMS algorithm were superior to the other methods for noisy speech enhancement.
Abstract:
A new battery modelling method is presented, based on a simulation-error minimization criterion rather than the conventional prediction-error criterion. A new integrated optimization method to optimize the model parameters is proposed. The method is validated on a set of Li-ion battery test data, and the results confirm the advantages of the proposed method in terms of model generalization performance and long-term prediction accuracy.
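The distinction between the two criteria can be shown on a toy first-order model (hypothetical; the paper's actual battery model and optimizer are not specified in the abstract). One-step prediction error feeds back the *measured* output, so a parameter bias looks small; simulation (free-run) error feeds back the *simulated* output, so the same bias compounds over the horizon.

```python
import numpy as np

# Toy data from a first-order system y[k] = a*y[k-1] + b*u[k]
# (an assumed stand-in for a battery voltage model, not the paper's model).
a_true, b_true = 0.95, 0.5
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 200)
y = np.zeros(len(u))
for k in range(1, len(u)):
    y[k] = a_true * y[k - 1] + b_true * u[k]

def prediction_sse(a, b):
    """One-step-ahead errors: previous *measured* output fed back."""
    e = y[1:] - (a * y[:-1] + b * u[1:])
    return float(np.sum(e ** 2))

def simulation_sse(a, b):
    """Free-run errors: previous *simulated* output fed back."""
    ys = np.zeros_like(y)
    for k in range(1, len(u)):
        ys[k] = a * ys[k - 1] + b * u[k]
    return float(np.sum((y - ys) ** 2))

# A small bias in `a` looks harmless one step ahead but compounds in free run.
a_hat, b_hat = 0.90, 0.5
```

Fitting by minimizing `simulation_sse` rather than `prediction_sse` therefore directly targets the long-term prediction accuracy the abstract emphasizes, at the cost of a harder (non-convex, recursive) optimization problem.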
Abstract:
Master's thesis in Biology (Molecular Biology and Genetics). Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
This paper presents a complete quadratic-programming formulation of the standard thermal unit commitment problem in power generation planning, together with a novel iterative optimisation algorithm for its solution. The algorithm, based on a mixed-integer formulation of the problem, considers piecewise linear approximations of the quadratic fuel cost function that are dynamically updated in an iterative way, converging to the optimum; this avoids resorting to quadratic programming and makes the solution process much quicker. Extensive computational tests on a broad set of benchmark instances showed the algorithm to be flexible and capable of easily incorporating different problem constraints. Indeed, it is able to tackle ramp constraints, which, although very important in practice, were rarely considered in previous publications. Most importantly, optimal solutions were obtained for several well-known benchmark instances, including instances of practical relevance not previously known to have been solved to optimality. Computational experiments and their results showed that the proposed method is both simple and extremely effective.
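The idea of dynamically refining a piecewise linear under-approximation of a convex quadratic fuel cost can be sketched in isolation. The single-generator setting, cost coefficients, and worst-gap refinement rule below are assumptions for illustration, not the paper's algorithm, which embeds the refinement inside a mixed-integer unit commitment model.

```python
# Convex quadratic fuel cost C(p) = c2*p^2 + c1*p + c0 for one generator
# (hypothetical coefficients), approximated from below by tangent cuts.
c2, c1, c0 = 0.002, 10.0, 200.0
p_min, p_max = 50.0, 400.0          # generator output limits (MW)

cost = lambda p: c2 * p * p + c1 * p + c0
grid = [p_min + i * (p_max - p_min) / 500 for i in range(501)]

cuts = []                            # each cut (m, q) underestimates C as m*p + q
def add_cut(p):
    m = 2 * c2 * p + c1              # tangent slope at p
    cuts.append((m, cost(p) - m * p))

def approx(p):
    # Piecewise linear model: upper envelope of all tangent cuts.
    return max(m * p + q for m, q in cuts)

add_cut(p_min)
add_cut(p_max)
for _ in range(8):
    # Refine where the piecewise model is loosest, tightening it toward C(p).
    p_worst = max(grid, key=lambda p: cost(p) - approx(p))
    add_cut(p_worst)

gap = max(cost(p) - approx(p) for p in grid)
```

Because the quadratic is convex, every tangent cut is a valid underestimate, so the linear model never overstates cost and the gap shrinks as cuts accumulate; in the paper's setting the refinement point would come from the incumbent mixed-integer solution rather than a grid search.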