967 results for Optimal frame-level timing estimator
Abstract:
Global efforts to mitigate climate change are guided by projections of future temperatures [1]. But the eventual equilibrium global mean temperature associated with a given stabilization level of atmospheric greenhouse gas concentrations remains uncertain [1-3], complicating the setting of stabilization targets to avoid potentially dangerous levels of global warming [4-8]. Similar problems apply to the carbon cycle: observations currently provide only a weak constraint on the response to future emissions [9-11]. Here we use ensemble simulations of simple climate-carbon-cycle models constrained by observations and projections from more comprehensive models to simulate the temperature response to a broad range of carbon dioxide emission pathways. We find that the peak warming caused by a given cumulative carbon dioxide emission is better constrained than the warming response to a stabilization scenario. Furthermore, the relationship between cumulative emissions and peak warming is remarkably insensitive to the emission pathway (timing of emissions or peak emission rate). Hence policy targets based on limiting cumulative emissions of carbon dioxide are likely to be more robust to scientific uncertainty than emission-rate or concentration targets. Total anthropogenic emissions of one trillion tonnes of carbon (3.67 trillion tonnes of CO2), about half of which has already been emitted since industrialization began, result in a most likely peak carbon-dioxide-induced warming of 2 °C above pre-industrial temperatures, with a 5-95% confidence interval of 1.3-3.9 °C.
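As a rough illustration of the cumulative-emissions framing, the sketch below assumes a simple linear scaling of peak warming with cumulative carbon, using only the central estimate and 5-95% range quoted above; the function and the linearity assumption are illustrative, not the ensemble model used in the study.

```python
# Minimal sketch: linear scaling of peak CO2-induced warming with
# cumulative carbon emissions, using the abstract's central estimate
# (~2 degC per trillion tonnes of carbon, 5-95% range 1.3-3.9 degC).
# The linear form and the function name are illustrative assumptions,
# not the constrained ensemble model used in the paper.

def peak_warming(cumulative_TtC: float) -> tuple[float, float, float]:
    """Return (best estimate, 5th, 95th percentile) of peak warming in degC."""
    best, lo, hi = 2.0, 1.3, 3.9  # per 1 TtC, from the abstract
    return (best * cumulative_TtC, lo * cumulative_TtC, hi * cumulative_TtC)

print(peak_warming(1.0))   # one trillion tonnes of carbon
print(peak_warming(0.5))   # roughly what has been emitted so far
```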
Abstract:
The planning of semi-autonomous vehicles in traffic scenarios is a relatively new problem that contributes towards the goal of making road travel by vehicles free of human drivers. An algorithm needs to ensure optimal real-time planning of multiple vehicles (moving in either direction along a road) in the presence of a complex obstacle network. Unlike other approaches, here we assume that speed lanes are not present and that different lanes do not need to be maintained for inbound and outbound traffic. Our basic hypothesis is that the planning task should ensure that each vehicle maintains a sufficient distance from all other vehicles, obstacles and road boundaries. We present here a 4-layer planning algorithm that consists of road selection (for selecting the individual roads of traversal to reach the goal), pathway selection (a strategy to avoid and/or overtake obstacles, road diversions and other blockages), pathway distribution (to select the position of a vehicle at every instant of time in a pathway), and trajectory generation (for generating a curve smooth enough to allow the maximum possible speed). Cooperation between vehicles is handled separately at the different levels, the aim being to maximize the separation between vehicles. Simulation results exhibit smooth, efficient and safe driving in multiple scenarios, along with typical vehicle behaviours including following and overtaking.
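A structural sketch of the four-layer decomposition described above, with each layer stubbed; all names, signatures and return values are hypothetical placeholders, not the authors' implementation.

```python
# Structural sketch of the 4-layer planner described in the abstract.
# Every function body is a stub standing in for the paper's actual
# algorithms; names and data shapes are illustrative assumptions.

def select_roads(road_graph, start, goal):
    """Layer 1: choose the sequence of roads leading to the goal."""
    return [start, goal]  # placeholder route

def select_pathway(road, obstacles):
    """Layer 2: pick a strategy to avoid/overtake obstacles and blockages."""
    return {"road": road, "side": "left"}  # placeholder strategy

def distribute_pathway(pathway, t):
    """Layer 3: position of the vehicle within the pathway at time t."""
    return (t * 1.0, 0.0)  # placeholder (x, y)

def generate_trajectory(points):
    """Layer 4: smooth curve through the points, allowing maximum speed."""
    return points  # placeholder: a real planner would fit a smooth spline

route = select_roads({}, "A", "B")
pathway = select_pathway(route[0], obstacles=[])
waypoints = [distribute_pathway(pathway, t) for t in range(5)]
print(generate_trajectory(waypoints))
```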
Abstract:
In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained, comprising a considerably smaller number of parameters than those generated by the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparisons with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
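A minimal sketch of greedy OLS-style regressor selection of the kind described above, using generic numpy least squares as the selection criterion; the code is an illustration of the technique, not the authors' implementation, and the data are synthetic.

```python
# Sketch of OLS-style forward selection of regressor terms, as used at
# each PNN layer in the abstract: greedily add the candidate column
# that most reduces the output error variance. Generic numpy
# illustration under assumed data; not the paper's code.
import numpy as np

def forward_select(X, y, n_terms):
    """Greedily pick n_terms columns of X that best explain y."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_terms):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            coef, rss, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = rss[0] if rss.size else np.sum((y - X[:, cols] @ coef) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 2 * X[:, 3] - X[:, 7] + 0.1 * rng.normal(size=100)
print(forward_select(X, y, 2))  # expected to pick columns 3 and 7
```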
Abstract:
This article elucidates the Typological Primacy Model (TPM; Rothman, 2010, 2011, 2013) for the initial stages of adult third language (L3) morphosyntactic transfer, addressing questions that stem from the model and its application. The TPM maintains that structural proximity between the L3 and the L1 and/or the L2 determines L3 transfer. In addition to demonstrating empirical support for the TPM, this article articulates a proposal for how the mind unconsciously determines typological (structural) proximity based on linguistic cues from the L3 input stream used by the parser early on to determine holistic transfer of one previous (the L1 or the L2) system. This articulated version of the TPM is motivated by argumentation appealing to cognitive and linguistic factors. Finally, in line with the general tenets of the TPM, I ponder if and why L3 transfer might obtain differently depending on the type of bilingual (e.g. early vs. late) and proficiency level of bilingualism involved in the L3 process.
Abstract:
This work is motivated by the motion planning problem for oriented vehicles travelling in a 3-dimensional space: Euclidean space E3, the sphere S3, or the hyperboloid H3. For such problems the orientation of the vehicle is naturally represented by an orthonormal frame over a point in the underlying manifold. The orthonormal frame bundles of the space forms R3, S3 and H3 coincide with their isometry groups: the Euclidean group of motions SE(3), the rotation group SO(4), and the Lorentz group SO(1,3), respectively. The focus therefore shifts to left-invariant control systems defined on these Lie groups. In this paper a method for integrating these systems is given where the controls are time-independent. For constant twist motions, or helical motions, the corresponding curves g(t) ∈ SE(3) are given in closed form using the well-known Rodrigues formula. However, this formula is applicable only in the Euclidean case. This paper gives a method for computing the non-Euclidean screw/helical motions in closed form. This involves decoupling the system into two lower-dimensional systems using the double-cover properties of Lie groups; the lower-dimensional systems are then solved explicitly in closed form.
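For reference, the Rodrigues formula invoked above for the Euclidean case, together with its standard SE(3) extension (textbook forms; the notation is ours, not the paper's). Here ω̂ is the skew-symmetric matrix of ω and θ = ‖ω‖.

```latex
% Rodrigues' formula on SO(3) and the SE(3) exponential (standard forms).
\[
e^{\hat{\omega}} = I + \frac{\sin\theta}{\theta}\,\hat{\omega}
                 + \frac{1-\cos\theta}{\theta^{2}}\,\hat{\omega}^{2},
\]
\[
\exp\!\begin{pmatrix}\hat{\omega} & v\\ 0 & 0\end{pmatrix}
  = \begin{pmatrix} e^{\hat{\omega}} & Vv\\ 0 & 1\end{pmatrix},
\qquad
V = I + \frac{1-\cos\theta}{\theta^{2}}\,\hat{\omega}
      + \frac{\theta-\sin\theta}{\theta^{3}}\,\hat{\omega}^{2}.
\]
```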
Abstract:
This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect-reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to optimally decompose the signals in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
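A baseline sketch of the multilevel decomposition step described above, using a standard Daubechies filter from PyWavelets (assumed available); the paper's numerically optimized QMF banks are not reproduced here, and the signal is synthetic.

```python
# Baseline sketch: multilevel wavelet decomposition of an ECG-like
# signal with a fixed Daubechies filter (PyWavelets, assumed installed
# via `pip install PyWavelets`). The paper replaces such standard
# filters with numerically optimized QMF banks; that optimization is
# not reproduced here.
import numpy as np
import pywt

fs = 360                                   # Hz, a common ECG sampling rate
t = np.arange(0, 2, 1 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)

coeffs = pywt.wavedec(ecg_like, 'db4', level=4)
# coeffs = [cA4, cD4, cD3, cD2, cD1]; such coefficients would feed the
# neural network classifier as a parsimonious representation.
print([c.size for c in coeffs])
```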
Abstract:
An equation of Monge-Ampère type has, for the first time, been solved numerically on the surface of the sphere in order to generate optimally transported (OT) meshes, equidistributed with respect to a monitor function. Optimal transport generates meshes that keep the same connectivity as the original mesh, making them suitable for r-adaptive simulations, in which the equations of motion can be solved in a moving frame of reference in order to avoid mapping the solution between old and new meshes and to avoid load-balancing problems on parallel computers. The semi-implicit solution of the Monge-Ampère type equation involves a new linearisation of the Hessian term, and exponential maps are used to map from old to new meshes on the sphere. The determinant of the Hessian is evaluated as the change in volume between old and new mesh cells, rather than using numerical approximations to the gradients. OT meshes are generated for comparison with centroidal Voronoi tessellations on the sphere and are found to have advantages and disadvantages: OT equidistribution is more accurate, the number of iterations to convergence is independent of the mesh size, face skewness is reduced and the connectivity does not change. However, anisotropy is higher and the OT meshes are non-orthogonal. It is shown that optimal transport on the sphere leads to meshes that do not tangle. However, tangling can be introduced by numerical errors in calculating the gradient of the mesh potential. Methods for alleviating this problem are explored. Finally, OT meshes are generated using observed precipitation as a monitor function, in order to demonstrate the potential power of the technique.
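For orientation, the flat-space equidistribution condition of Monge-Ampère type that underlies this approach, in a standard form (the spherical version in the paper replaces the gradient map by exponential maps and uses the new linearisation of the Hessian term mentioned above); m is the monitor function and φ the mesh potential.

```latex
% Flat-space Monge-Ampere equidistribution condition (standard form;
% notation is ours, not the paper's).
\[
m\!\left(\boldsymbol{x}(\boldsymbol{\xi})\right)\,
\det\!\left(I + \nabla\nabla\phi\right) = \text{const},
\qquad
\boldsymbol{x}(\boldsymbol{\xi}) = \boldsymbol{\xi} + \nabla\phi .
\]
```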
Abstract:
The paper explores pollination from a multilevel policy perspective and analyses the institutional fit and interplay of multi-faceted pollination-related policies. First, it asks what the major policies are that frame pollination at the EU level. Second, it explores the relationship between the EU policies and localised ways of understanding pollination. Addressed third is how the concept of ecosystem services can aid in understanding the various ways of framing and governing the situation. The results show that the policy systems affecting pollination are abundant and that these systems create different kinds of pressure on stakeholders, at several levels of society. The local-level concerns are more about the loss of pollination services than about loss of pollinators. This points to the problem of fit between local activity driven by economic reasoning and biodiversity-driven EU policies. Here we see the concept of ecosystem services having some potential, since its operationalisation can combine economic and environmental considerations. Furthermore, the analysis shows how, instead of formal institutions, it seems that social norms, habits, and motivation are the key to understanding and developing effective and attractive governance measures.
Abstract:
Many generalist populations may actually be composed of relatively specialist individuals. This "individual specialization" may have important ecological and evolutionary implications. Although this phenomenon has been documented in more than one hundred taxa, it is still unclear how individuals within a population actually partition resources. Here we applied several methods based on network theory to investigate the intrapopulation patterns of resource use in the gracile mouse opossum Gracilinanus microtarsus. We found evidence of significant individual specialization in this species and that the diets of specialists are nested within the diets of generalists. This novel pattern is consistent with a recently proposed model of optimal foraging and implies strong asymmetry in the interactions among individuals of a population.
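A minimal sketch of one standard way to quantify individual specialization, the proportional-similarity index PS_i = 1 - 0.5 Σ_j |p_ij - q_j|; the data and the choice of index are illustrative, and this is not the study's network-theoretic analysis.

```python
# Proportional-similarity index of individual specialization (a common
# textbook measure; illustrative data, not the study's dataset).
import numpy as np

# rows = individuals, cols = resource categories (counts of items eaten)
diet = np.array([[8, 1, 1],
                 [1, 9, 0],
                 [3, 3, 4]], dtype=float)

p = diet / diet.sum(axis=1, keepdims=True)   # individual diet proportions
q = diet.sum(axis=0) / diet.sum()            # population diet proportions
ps = 1 - 0.5 * np.abs(p - q).sum(axis=1)     # PS_i per individual
print(ps)  # lower PS_i -> more specialized individual
```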
Abstract:
J.A. Ferreira Neto, E.C. Santos Junior, U. Fra Paleo, D. Miranda Barros, and M.C.O. Moreira. 2011. Optimal subdivision of land in agrarian reform projects: an analysis using genetic algorithms. Cien. Inv. Agr. 38(2): 169-178. The objective of this manuscript is to develop a new procedure to achieve optimal land subdivision using genetic algorithms (GA). The genetic algorithm was tested in the rural settlement of Veredas, located in Minas Gerais, Brazil. This implementation was based on the land aptitude and its productivity index. The sequence of tests in the study was carried out in two areas with eight different agricultural aptitude classes, including one area of 391.88 ha subdivided into 12 lots and another of 404.1763 ha subdivided into 14 lots. The effectiveness of the method was measured using the standard deviation of the productivity index of the lots in each parceled area. To evaluate each parameter, a sequence of 15 calculations was performed to record the best individual fitness average (MMI) found for each parameter variation. The best parameter combination found in testing and used to generate the new parceling with the GA was the following: 320 generations, a population of 40 individuals, a mutation rate of 0.8, and a renewal rate of 0.3. The solution generated rather homogeneous lots in terms of productive capacity.
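A skeleton genetic algorithm using the parameter combination reported above (320 generations, population 40, mutation rate 0.8, renewal rate 0.3); the binary encoding and the fitness function are placeholders, not the paper's land-subdivision model.

```python
# GA skeleton with the parameter values reported in the abstract.
# Encoding and fitness are placeholder assumptions: the paper scores
# homogeneity of lot productivity, which is not reproduced here.
import random

GENERATIONS, POP_SIZE, MUT_RATE, RENEW_RATE = 320, 40, 0.8, 0.3
N_GENES = 12  # e.g. one gene per lot in the first test area

def fitness(ind):
    return -abs(sum(ind) - N_GENES)  # placeholder objective

def mutate(ind):
    return [g ^ 1 if random.random() < MUT_RATE else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(N_GENES)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    n_new = int(RENEW_RATE * POP_SIZE)          # renewal: replace the worst
    pop[-n_new:] = [[random.randint(0, 1) for _ in range(N_GENES)]
                    for _ in range(n_new)]
    pop = [mutate(ind) for ind in pop]

best = max(pop, key=fitness)
print(best, fitness(best))
```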
Abstract:
Studies have shown that the increase of cell metabolism depends on the low level laser therapy (LLLT) parameters used to irradiate the cells. However, the optimal laser dose to up-regulate pulp cell activity remains unknown. Consequently, the aim of this study was to evaluate the metabolic response of odontoblast-like cells (MDPC-23) exposed to different LLLT doses. Cells at 20,000 cells/cm² were seeded in 24-well plates using plain culture medium (DMEM) and were incubated in a humidified incubator with 5% CO₂ at 37 °C. After 24 h, the culture medium was replaced by fresh DMEM supplemented with 5% (stress by nutritional deficit) or 10% fetal bovine serum (FBS). The cells were exposed to different laser doses from a near infrared diode laser prototype designed to provide a uniform irradiation of the wells. The experimental groups were: G1: 1.5 J/cm² + 5% FBS; G2: 1.5 J/cm² + 10% FBS; G3: 5 J/cm² + 5% FBS; G4: 5 J/cm² + 10% FBS; G5: 19 J/cm² + 5% FBS; G6: 19 J/cm² + 10% FBS. LLLT was performed in 3 consecutive irradiation cycles with a 24-hour interval. Non-irradiated cells cultured in DMEM supplemented with either 5 or 10% FBS served as control groups. The analysis of the metabolic response was performed by the MTT assay 3 h after the last irradiation. G1 presented an increase in SDH enzyme activity and differed significantly (Mann-Whitney test, p < 0.05) from the other groups. Analysis by scanning electron microscopy showed normal cell morphology in all groups. Under the tested conditions, LLLT stimulated the metabolic activity of MDPC-23 cultured in DMEM supplemented with 5% FBS and exposed to a laser dose of 1.5 J/cm². These findings are relevant for further studies on the action of near infrared lasers on cells with odontoblast phenotype.
Abstract:
BACKGROUND AND OBJECTIVE: To a large extent, people who have suffered a stroke report unmet needs for rehabilitation. The purpose of this study was to explore aspects of rehabilitation provision that potentially contribute to self-reported met needs for rehabilitation 12 months after stroke, with consideration also of stroke severity. METHODS: The participants (n = 173) received care at the stroke units at the Karolinska University Hospital, Sweden. Using a questionnaire, the dependent variable, self-reported met needs for rehabilitation, was collected at 12 months after stroke. The independent variables were based on data retrieved from registers and structured according to four aspects of rehabilitation provision: amount of rehabilitation, service level (day care rehabilitation, primary care rehabilitation and home-based rehabilitation), operator level (physiotherapist, occupational therapist, speech therapist) and time after stroke onset. Multivariate logistic regression analyses regarding the aspects of rehabilitation were performed for the participants, who were divided into three groups based on stroke severity at onset. RESULTS: Participants with moderate/severe stroke who had seen a physiotherapist at least once during each of the 1st, 2nd and 3rd-4th quarters of the first year (OR 8.36, CI 1.40-49.88, P = 0.020) were more likely to report met rehabilitation needs. CONCLUSION: For people with moderate/severe stroke, continuity in rehabilitation (preferably physiotherapy) during the first year after stroke seems to be associated with self-reported met needs for rehabilitation.
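A minimal sketch of the kind of multivariate logistic regression the study describes, fitted on synthetic data with illustrative variable names; it reproduces the shape of the method (binary outcome, odds ratios), not the study's data or results.

```python
# Multivariate logistic regression sketch (statsmodels), with synthetic
# data standing in for the register-based predictors; variable names
# are illustrative assumptions, not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 173
saw_pt_each_quarter = rng.integers(0, 2, n)   # physiotherapist contact
rehab_amount = rng.normal(10, 3, n)           # e.g. sessions received
logit = -1.0 + 2.1 * saw_pt_each_quarter + 0.05 * rehab_amount
met_needs = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([saw_pt_each_quarter, rehab_amount]))
model = sm.Logit(met_needs, X).fit(disp=0)
print(np.exp(model.params))  # odds ratios, cf. the OR reported above
```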
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper limb motor tasks by tracing a pre-drawn Archimedes spiral as shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis. The score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis. The first method was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment, henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity, henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between the HE subjects and the three patient groups (P=0.626 for the S group, with a 9.9% mean value difference; P=0.089 for the I group, with 30.2%; and P=0.0019 for the A group, with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83) and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was correlated with neither of the other two methods, indicating that it measures a different construct of upper limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
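A minimal sketch of Approximate Entropy with the parameter choices reported above (window size m = 4, similarity measure r = 0.2 of the series' standard deviation); the spiral preprocessing and the normalization by drawing completion time are not reproduced here, and the test signals are synthetic.

```python
# Approximate Entropy (ApEn) sketch with the abstract's parameters:
# m = 4 and r = 0.2 * SD of the series. Standard textbook form, with
# self-matches included; not the paper's full pipeline.
import numpy as np

def apen(x, m=4, r_frac=0.2):
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])   # embedded vectors
        # Chebyshev distance between all pairs of embedded vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)                        # match fractions
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
print(apen(np.sin(np.linspace(0, 8 * np.pi, 200))))  # regular: low ApEn
print(apen(rng.normal(size=200)))                    # irregular: higher
```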
Abstract:
The inconsistency between theory and agents' observed behavior in the private pension market has proved to be one of the most persistent puzzles in the economics literature. In models of intertemporal consumption and savings optimization under uncertainty about agents' lifetimes, annuities are dominant assets, eliminating or strongly restricting the demand for assets whose returns are unrelated to survival probability. In practice, however, consumers are extremely skeptical of annuities. As the counterpart to the longevity insurance that annuities provide, claims on these essentially illiquid assets cease upon the holder's death. Accordingly, uninsurable liquidity shocks and the presence of bequest motives have been extensively explored as possible determinants of the low observed demand. Despite these efforts, the puzzle persists. This work extends the theoretical dominance of annuities over non-contingent assets in incomplete markets: full dominance in the absence of bequest motives, and partial dominance when agents care about potential heirs. In line with the literature, numerical simulations show that a considerable share of agents' optimal portfolio would consist of annuities even in the face of liquidity shocks, bequest motives, and actuarially unfair prices. Regarding an aspect relatively neglected in the literature, we show that the optimal timing for converting savings into annuities is directly related to agents' wage curve. Finally, we show that if agents' preferences are such that the optimal consumption level declines with age, the demand for annuities becomes quite sensitive to the industry's markup over the actuarially fair price, falling to levels far more consistent with the empirical evidence.
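For context, the standard actuarial identities behind the dominance argument sketched above (textbook forms; the notation is ours, not the thesis's). Here ₜpₓ is the probability that an individual aged x survives t more years and r is the risk-free rate.

```latex
% Actuarially fair annuity price and the one-period "mortality credit".
\[
a_x = \sum_{t=1}^{\infty} \frac{{}_{t}p_{x}}{(1+r)^{t}},
\qquad
1 + r^{\text{annuity}} = \frac{1+r}{{}_{1}p_{x}} > 1 + r .
\]
% Conditional on survival, an annuity out-returns a conventional bond,
% which is why it dominates non-contingent assets in these models
% absent bequest motives or uninsurable liquidity shocks.
```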