63 results for Method of moments algorithm
Abstract:
This article presents a reinterpretation of James Harrington's writings. It takes issue with J. G. A. Pocock's reading, which treats him as importing into England a Machiavellian ‘language of political thought’. This reading is the basis of Pocock's stress on the republicanism of eighteenth-century opposition values. Harrington's writings were in fact a most implausible channel for such ideas. His outlook owed much to Stoicism. Unlike the Florentine, he admired the contemplative life; was sympathetic to commerce; and was relaxed about the threat of ‘corruption’ (a concept that he did not understand). These views can be associated with his apparent aims: the preservation of a national church with a salaried but politically impotent clergy; and the restoration of the royalist gentry to a leading role in English politics. Pocock's hypothesis is shown to be conditioned by his method; its weaknesses reflect some difficulties inherent in the notion of ‘languages of thought’.
Abstract:
Particulate antigen assemblies in the nanometer range and DNA plasmids are particularly interesting for designing vaccines. We hypothesised that a combination of these approaches could result in a new delivery method for a gp160 envelope HIV-1 vaccine which could combine the potency of virus-like particles (VLPs) and the simplicity of use of DNA vaccines. Characterisation of lentivirus-like particles (lentiVLPs) by western blot, dynamic light scattering and electron microscopy revealed that their protein pattern, size and structure make them promising candidates for HIV-1 vaccines. Although all particles were similar with regard to size and distribution, they clearly differed in p24 capsid protein content, suggesting that Rev may be required for particle maturation and Gag processing. In vivo, lentiVLP pseudotyping with the gp160 envelope or with a combination of gp160 and VSV-G envelopes did not influence the magnitude of the immune response, but the combination of lentiVLPs with Alum adjuvant resulted in a more potent response. Interestingly, the strongest immune response was obtained when plasmids encoding lentiVLPs were co-delivered to mouse muscle by electrotransfer, suggesting that lentiVLPs were efficiently produced in vivo or that the packaging genes mediate an adjuvant effect. DNA electrotransfer of plasmids encoding lentivirus-like particles offers many advantages and therefore appears to be a promising delivery method for HIV-1 vaccines.
Keywords: VLP, Electroporation, Electrotransfer, HIV vaccine, DNA vaccine
Abstract:
We present a methodology that allows a sea ice rheology, suitable for use in a General Circulation Model (GCM), to be determined from laboratory and tank experiments on sea ice when combined with a kinematic model of deformation. The laboratory experiments determine a material rheology for sea ice, and would investigate a nonlinear friction law of the form τ ∝ σ_n^(2/3), instead of the more familiar Amontons' law, τ = μσ_n (where τ is the shear stress, μ is the coefficient of friction and σ_n is the normal stress). The modelling approach considers a representative region R containing ice floes (or floe aggregates), separated by flaws. The deformation of R is imposed and the motion of the floes determined using a kinematic model, which will be motivated from SAR observations. Deformation of the flaws is inferred from the floe motion and stress determined from the material rheology. The stress over R is then determined from the area-weighted contribution from flaws and floes.
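The two friction laws contrasted in this abstract can be sketched numerically. A minimal illustration follows; the coefficient values are arbitrary placeholders, not measurements from the experiments described above.

```python
# Hedged sketch: the linear Amontons law versus the nonlinear power law
# mentioned in the abstract. Function names and coefficients are
# illustrative, not taken from the paper.

def amontons_shear(sigma_n, mu=0.5):
    """Linear friction: tau = mu * sigma_n."""
    return mu * sigma_n

def power_law_shear(sigma_n, k=0.5):
    """Nonlinear friction: tau = k * sigma_n**(2/3)."""
    return k * sigma_n ** (2.0 / 3.0)

# The power law grows sub-linearly with normal stress, so its relative
# shear resistance falls as the normal stress increases.
for s in (0.1, 1.0, 10.0):
    print(s, amontons_shear(s), power_law_shear(s))
```

The crossover point between the two laws depends entirely on the chosen coefficients; the sketch only illustrates the different scaling with normal stress.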
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet the method has two problems. An individual may guess the correct answer at any concentration step, and may detect correctly at low concentrations but become adapted or fatigued at higher ones. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration. The technique reduces the chance probability where there are consecutive correct answers. Adjusted sequences are submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold and a taste threshold study. It resulted in group thresholds similar to ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were determined. The approach provides a more robust technique than previous estimation methods.
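The chance-guessing and censoring issues described in this abstract can be illustrated with a toy sketch. The assumptions here are a 3-AFC test with guessing probability 1/3 and a simple "lowest concentration from which all answers are correct" rule; this is not the paper's exact adjustment procedure or the full survival fit.

```python
# Hedged toy sketch of two ingredients of threshold estimation:
# (1) Abbott's correction for chance guessing, and (2) an individual
# threshold rule whose "never detected" outcome is right-censored,
# which is what makes survival analysis a natural fit.

P_CHANCE = 1.0 / 3.0  # assumed 3-alternative forced choice

def chance_corrected(p_correct, p_chance=P_CHANCE):
    """Abbott's correction: proportion of the group truly detecting."""
    return max(0.0, (p_correct - p_chance) / (1.0 - p_chance))

def individual_threshold(answers, concentrations):
    """Lowest concentration from which every subsequent answer is correct."""
    for i in range(len(answers)):
        if all(answers[i:]):
            return concentrations[i]
    return None  # never detected: a right-censored observation

# Example: wrong at steps 1 and 3, correct from step 4 onward.
answers = [False, True, False, True, True]
concs = [1, 2, 4, 8, 16]
print(individual_threshold(answers, concs))  # threshold at concentration 8
```

Censored individuals (those returning `None`) would enter a Kaplan-Meier or parametric survival fit as censored at the highest concentration tested, rather than being discarded.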
Abstract:
We report for the first time a detailed procedure for creating a simulation model of energetically stable, folded graphene-like pores and simulation results of CO2/CH4 and CO2/N2 separation using these structures. We show that folding of graphene structures is a very promising method to improve the separation of CO2 from mixtures with CH4 and N2. The separation properties of the analyzed materials are compared with carbon nanotubes having similar diameters or S/V ratio. The presented results have potential importance in the field of CO2 capture and sequestration.
Abstract:
This article is concerned with the liability of search engines for algorithmically produced search suggestions, such as those generated by Google's 'autocomplete' function. Liability in this context may arise when automatically generated associations have an offensive or defamatory meaning, or may even induce infringement of intellectual property rights. The increasing number of cases that have been brought before courts all over the world raises questions on the conflict between the fundamental freedoms of speech and access to information on the one hand, and the personality rights of individuals, under a broader right of informational self-determination, on the other. In the light of the recent judgment of the Court of Justice of the European Union (EU) in Google Spain v AEPD, this article concludes that many requests for removal of suggestions including private individuals' information will be successful on the basis of EU data protection law, even absent prejudice to the person concerned.
Abstract:
Human Body Thermoregulation models have been widely used in human physiology and thermal comfort studies. However, there are few studies on evaluation methods for these models. This paper summarises the existing evaluation methods and critically analyses their flaws. Based on that, a method for evaluating the accuracy of Human Body Thermoregulation models is proposed. The new evaluation method contributes to the development of Human Body Thermoregulation models and validates their accuracy both statistically and empirically. The accuracy of different models can be compared with the new method. Furthermore, the new method is not only suitable for evaluating Human Body Thermoregulation models, but can in principle also be applied to evaluating the accuracy of population-based models in other research fields.
Abstract:
In numerical weather prediction, parameterisations are used to simulate missing physics in the model. These may be needed because of a lack of scientific understanding, or a lack of computing power to address all the known physical processes. Parameterisations are sources of large uncertainty in a model: the parameter values used in them cannot be measured directly and hence are often not well known, and the parameterisations themselves are also approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation (DA), such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential DA methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to pre-determined functional forms of missing physics or parameterisations that are based upon prior information. The method is applied to a one-dimensional advection model with additive model error, and it is shown that the method can accurately estimate parameterisations, with consistent error estimates. Furthermore, it is shown how the method depends on the quality of the DA results. The results indicate that this new method is a powerful tool in systematic model improvement.
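The core idea, diagnosing per-step model error and fitting it to a candidate functional form, can be sketched in a toy setting. The missing physics here is a linear damping term chosen purely for illustration; the paper's advection model and DA machinery are not reproduced.

```python
# Hedged toy sketch: the "truth" contains a damping term -gamma*u that the
# model lacks. Per-step errors (standing in for analysis increments from DA)
# are fitted by least squares to the candidate functional form -gamma*u*dt.
import random

GAMMA_TRUE = 0.3  # illustrative value, not from the paper

def truth_step(u, dt=0.1):
    return u - dt * GAMMA_TRUE * u  # includes the "missing physics"

def model_step(u, dt=0.1):
    return u                        # damping absent from the model

def estimate_gamma(states, dt=0.1):
    """Least-squares fit of per-step error to the candidate form -gamma*u*dt."""
    num = den = 0.0
    for u in states:
        err = truth_step(u, dt) - model_step(u, dt)  # diagnosed model error
        num += err * (-u * dt)                       # regress err on -u*dt
        den += (u * dt) ** 2
    return num / den

states = [random.uniform(-1.0, 1.0) for _ in range(100)]
print(estimate_gamma(states))  # recovers GAMMA_TRUE = 0.3 (noiseless case)
```

In a real system the diagnosed errors would come from DA increments and be noisy, so the fit quality would depend on the quality of the assimilation, as the abstract notes.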
Abstract:
Organic fertilizers based on seaweed extract potentially have beneficial effects on many crop plants. Here we investigate the impact of organic fertilizer on Rosmarinus officinalis measured by both yield and oil quality. Plants grown in a temperature-controlled greenhouse with a natural photoperiod and a controlled irrigation system were treated with seaweed fertilizer and an inorganic fertilizer of matching mineral composition but with no organic content. Treatments were either by spraying on to the foliage or watering direct to the compost. The essential oil was extracted by hydro-distillation with a Clevenger apparatus and analysed by gas-chromatography mass-spectrometry (GC–MS) and NMR. The chemical compositions of the plants were compared, and qualitative differences were found between fertilizer treatments and application methods. Thus sprayed seaweed fertilizer showed a significantly higher percentage of α-pinene, β-phellandrene, γ-terpinene (monoterpenes) and 3-methylenecycloheptene than other treatments. Italicene, α-bisabolol (sesquiterpenes), α-thujene, and E-isocitral (monoterpenes) occurred in significantly higher percentages for plants watered with the seaweed extract. Each was significantly different to the inorganic fertilizer and to controls. The seaweed treatments caused a significant increase in oil amount and leaf area as compared with both inorganic treatments and the control regardless of application method.
Abstract:
Tropospheric ozone is an air pollutant thought to reduce crop yields across Europe. Much experimental scientific work has been completed or is currently underway to quantify yield effects at ambient ozone levels. In this research, we seek to directly evaluate whether such effects are observed at the farm level. This is done by intersecting a farm level panel dataset for winter wheat farms in England & Wales with information on ambient ozone, and estimating a production function with ozone as a fixed input. Panel data methods, Generalised Method of Moments (GMM) techniques and nested exogeneity tests are employed in the estimation. The results confirm a small, but nevertheless statistically significant negative effect of ambient ozone levels on wheat yields.
Abstract:
Subsidised energy prices in pre-transition Hungary had led to excessive energy intensity in the agricultural sector. Transition has resulted in steep input price increases. In this study, Allen and Morishima elasticities of substitution are estimated to study the effects of these price changes on energy use, chemical input use, capital formation and employment. Panel data methods, Generalised Method of Moments (GMM) and instrument exogeneity tests are used to specify and estimate technology and substitution elasticities. Results indicate that indirect price policy may be effective in controlling energy consumption. The sustained increases in energy and chemical input prices have worked together to restrict energy and chemical input use, and the substitutability between energy, capital and labour has prevented the capital shrinkage and agricultural unemployment situations from being worse. The Hungarian push towards lower energy intensity may be best pursued through sustained energy price increases rather than capital subsidies.
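For reference, the two substitution measures named in this abstract are standard textbook quantities (stated here from general knowledge, not reproduced from the paper). Given a cost function C(p, y) with price derivatives C_i = ∂C/∂p_i, the Allen-Uzawa and Morishima elasticities are:

```latex
\sigma^{A}_{ij} = \frac{C \, C_{ij}}{C_i \, C_j}, \qquad
\varepsilon_{ij} = s_j \, \sigma^{A}_{ij}, \qquad
M_{ij} = \frac{\partial \ln (x_i / x_j)}{\partial \ln p_j}
       = \varepsilon_{ij} - \varepsilon_{jj},
```

where s_j is the cost share of input j and ε_ij is the cross-price elasticity of demand for input i with respect to price p_j. The Morishima measure is asymmetric (M_ij ≠ M_ji in general), which is one reason both are often reported together, as here.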
Abstract:
In this paper we evaluate the relative influence of external versus domestic inflation drivers in the 12 new European Union (EU) member countries. Our empirical analysis is based on the New Keynesian Phillips Curve (NKPC) derived in Galí and Monacelli (2005) for small open economies (SOE). Employing the generalized method of moments (GMM), we find that the SOE NKPC is well supported in the new EU member states. We also find that the inflation process is dominated by domestic variables in the larger countries of our sample, whereas external variables are mostly relevant in the smaller countries.
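As a reminder of how a GMM estimator works in the simplest just-identified case, the sketch below solves a single moment condition E[z(y - βx)] = 0 on synthetic data. This is a generic illustration, not the NKPC system estimated in the paper.

```python
# Hedged sketch: just-identified GMM (equivalent to IV here). The
# regressor x is correlated with the error u, so OLS is biased; the
# instrument z is correlated with x but not with u.
import random

random.seed(0)
n = 500
z = [random.gauss(0, 1) for _ in range(n)]  # instrument
v = [random.gauss(0, 1) for _ in range(n)]
u = [random.gauss(0, 1) for _ in range(n)]  # structural error
x = [zi + vi + ui for zi, vi, ui in zip(z, v, u)]  # endogenous regressor
BETA = 0.7
y = [BETA * xi + ui for xi, ui in zip(x, u)]

def gmm_beta(y, x, z):
    """Solve the sample moment (1/n) * sum z_i*(y_i - beta*x_i) = 0."""
    return sum(zi * yi for zi, yi in zip(z, y)) / \
           sum(zi * xi for zi, xi in zip(z, x))

print(gmm_beta(y, x, z))  # typically close to 0.7 in large samples
```

With more instruments than parameters the moment conditions can no longer all be solved exactly, and GMM instead minimises a weighted quadratic form in the sample moments; that over-identified case is what NKPC estimation normally uses.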
Abstract:
A novel Linear Hashtable Method Predicted Hexagonal Search (LHMPHS) method for block-based motion compensation is proposed. Fast block matching algorithms (BMAs) use the origin as the initial search centre, which often does not track motion well. To improve the accuracy of fast BMAs, we employ a predicted starting search point that reflects the motion trend of the current block. The predicted search centre is closer to the global minimum, so centre-biased BMAs can find the motion vector more efficiently. The performance of the algorithm is evaluated on standard video sequences with respect to three important metrics. The results show that the proposed algorithm improves the accuracy of current hexagonal algorithms and compares favourably with Full Search, Logarithmic Search, etc.
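The hexagonal search pattern with a predicted start point can be sketched as follows. The cost function and start point here are placeholders; the paper's linear hashtable predictor and block-matching cost (e.g. SAD over a block) are not reproduced.

```python
# Hedged sketch of centre-biased hexagonal search: move a large hexagon
# toward lower cost until the centre wins, then refine with a small
# cross pattern. A good predicted start point means fewer iterations.

HEX = [(2, 0), (1, 2), (-1, 2), (-2, 0), (-1, -2), (1, -2)]  # large hexagon
SMALL = [(1, 0), (0, 1), (-1, 0), (0, -1)]                   # refinement step

def hex_search(cost, start=(0, 0), max_iter=64):
    """Hexagon-based search for the motion vector minimising cost(p)."""
    cx, cy = start
    for _ in range(max_iter):
        cands = [(cx, cy)] + [(cx + dx, cy + dy) for dx, dy in HEX]
        best = min(cands, key=cost)  # centre listed first, so it wins ties
        if best == (cx, cy):
            break                    # centre is best: stop moving the hexagon
        cx, cy = best
    finals = [(cx, cy)] + [(cx + dx, cy + dy) for dx, dy in SMALL]
    return min(finals, key=cost)

# Toy cost: squared distance to the "true" motion vector (5, 4). A block
# matcher would instead evaluate SAD between reference and current blocks.
mv = hex_search(lambda p: (p[0] - 5) ** 2 + (p[1] - 4) ** 2, start=(3, 2))
print(mv)
```

Starting from a predicted centre such as (3, 2) rather than the origin reduces the number of hexagon moves, which is the efficiency argument the abstract makes.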
Abstract:
Liquid clouds play a profound role in the global radiation budget, but it is difficult to retrieve their vertical profile remotely. Ordinary narrow field-of-view (FOV) lidars receive a strong return from such clouds, but the information is limited to the first few optical depths. Wide-angle multiple-FOV lidars can isolate radiation scattered multiple times before returning to the instrument, often penetrating much deeper into the cloud than the singly scattered signal. These returns potentially contain information on the vertical profile of the extinction coefficient, but are challenging to interpret owing to the lack of a fast radiative transfer model for simulating them. This paper describes a variational algorithm that incorporates a fast forward model based on the time-dependent two-stream approximation, and its adjoint. Application of the algorithm to simulated data from a hypothetical airborne three-FOV lidar with a maximum footprint width of 600 m suggests that this approach should be able to retrieve the extinction structure down to an optical depth of around 6, and total optical depth up to at least 35, depending on the maximum lidar FOV. The convergence behaviour of Gauss-Newton and quasi-Newton optimization schemes is compared. We then present results from an application of the algorithm to observations of stratocumulus by the 8-FOV airborne “THOR” lidar. It is demonstrated how the averaging kernel can be used to diagnose the effective vertical resolution of the retrieved profile, and therefore the depth to which information on the vertical structure can be recovered. This work enables returns from spaceborne lidar and radar subject to multiple scattering to be exploited more rigorously than previously possible.
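The Gauss-Newton scheme mentioned in this abstract can be shown in its simplest one-parameter form. This is a generic nonlinear least-squares fit on synthetic data, not the paper's two-stream forward model or its adjoint.

```python
# Hedged sketch of Gauss-Newton iteration: linearise the forward model
# about the current parameter, solve the resulting linear least-squares
# step, and repeat. The exponential-decay model is an arbitrary stand-in.
import math

ts = [0.0, 0.5, 1.0, 1.5, 2.0]
A_TRUE = 0.8
ys = [math.exp(-A_TRUE * t) for t in ts]  # noiseless synthetic "observations"

def gauss_newton(a=0.1, n_iter=20):
    """Fit y = exp(-a*t) by Gauss-Newton, starting from a poor guess."""
    for _ in range(n_iter):
        r = [y - math.exp(-a * t) for t, y in zip(ts, ys)]  # residuals
        J = [-t * math.exp(-a * t) for t in ts]             # d(model)/da
        # Normal-equations step: delta = (J'J)^-1 J'r (scalar case)
        a += sum(j * ri for j, ri in zip(J, r)) / sum(j * j for j in J)
    return a

print(gauss_newton())  # converges to A_TRUE = 0.8
```

Because the residuals vanish at the solution here, convergence is fast (near-quadratic); a quasi-Newton scheme would instead build up curvature information from gradients alone, trading per-iteration cost against iteration count, which is the comparison the paper makes.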