997 results for weak-strong uniqueness
Abstract:
This is the unrefereed version of the article: Inmaculada Higueras, Natalie Happenhofer, Othmar Koch, and Friedrich Kupka. 2014. Optimized strong stability preserving IMEX Runge-Kutta methods. J. Comput. Appl. Math. 272 (December 2014), 116-140. The final version is available at https://doi.org/10.1016/j.cam.2014.05.011
Abstract:
This paper presents preliminary results of research into the occurrence of strong anticyclonic systems that influenced the weather in Poland during the period 1971–2000. The study was based on NCEP/NCAR reanalysis data, including daily values of the 1000 and 500 hPa geopotential heights, maps of these geopotential heights, and maps of the sea-level pressure field. Using these data, a number of exceptionally strong high-pressure systems were identified, together with their areas of origin and subsequent development patterns. They were then broken down into five groups with similar dynamics. The number of systems in each group showed no significant long-term trend. The greatest differences between groups were identified in terms of their annual occurrence rates and centre pressure values.
Abstract:
Stabilized micron-sized bubbles, known as contrast agents, are often injected into the body to enhance ultrasound imaging of blood flow. The ability to detect such bubbles in blood depends on the relative magnitude of the acoustic power backscattered from the microbubbles (‘signal’) to the power backscattered from the red blood cells (‘noise’). Erythrocytes are acoustically small (Rayleigh regime), weak scatterers, and therefore the backscatter coefficient (BSC) of blood increases as the fourth power of frequency throughout the diagnostic frequency range. Microbubbles, on the other hand, are either resonant or super-resonant in the range 5-30 MHz. Above resonance, their total scattering cross-section remains constant with increasing frequency. In the present thesis, a theoretical model of the BSC of a suspension of red blood cells is presented and compared to the BSC of Optison® contrast agent microbubbles. It is predicted that, as the frequency increases, the BSC of red blood cell suspensions eventually exceeds the BSC of the strongly scattering microbubbles, leading to a dramatic reduction in signal-to-noise ratio (SNR). This decrease in SNR with increasing frequency was also confirmed experimentally using an active cavitation detector for different concentrations of Optison® microbubbles in erythrocyte suspensions of different hematocrits. The magnitude of the observed decrease in SNR correlated well with theoretical predictions in most cases, except for very dense suspensions of red blood cells, where it is hypothesized that the close proximity of erythrocytes inhibits the acoustic response of the microbubbles.
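A minimal numerical sketch of the scaling argument above: the blood BSC is taken to grow as the fourth power of frequency and the microbubble cross-section is treated as flat above resonance, so their ratio (the SNR) falls with frequency. Both prefactors are arbitrary illustrative values, not the thesis's measured coefficients.

```python
import numpy as np

# Frequencies spanning the 5-30 MHz range discussed above.
freq_mhz = np.linspace(5.0, 30.0, 6)

# Noise: Rayleigh-regime blood backscatter grows as frequency to the fourth power.
# Signal: the microbubble cross-section is treated as constant above resonance.
# The prefactors 1e-4 and 50.0 are illustrative, not measured values.
bsc_blood = 1e-4 * freq_mhz**4
bsc_bubbles = np.full_like(freq_mhz, 50.0)

snr_db = 10.0 * np.log10(bsc_bubbles / bsc_blood)
for f, s in zip(freq_mhz, snr_db):
    # SNR falls by 40 dB per decade of frequency under these assumptions.
    print(f"{f:5.1f} MHz   SNR = {s:6.1f} dB")
```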
Abstract:
Two new notions of reduction for terms of the λ-calculus are introduced and the question of whether a λ-term is beta-strongly normalizing is reduced to the question of whether a λ-term is merely normalizing under one of the new notions of reduction. This leads to a new way to prove beta-strong normalization for typed λ-calculi. Instead of the usual semantic proof style based on Girard's "candidats de réductibilité", termination can be proved using a decreasing metric over a well-founded ordering in a style more common in the field of term rewriting. This new proof method is applied to the simply-typed λ-calculus and the system of intersection types.
Abstract:
This is an addendum to our technical report BUCS TR-94-014 of December 19, 1994. It clarifies some statements, adds information on some related research, includes a comparison with research by de Groote, and fixes two minor mistakes in a proof.
Abstract:
This paper formally defines the operational semantics of TRAFFIC, a specification language for flow composition applications proposed in BUCS-TR-2005-014, and presents a type system based on desired safety assurances. We provide proofs on reduction (weak confluence, strong normalization and uniqueness of normal forms), on the soundness and completeness of the type system with respect to reduction, and on equivalence classes of flow specifications. Finally, we provide a pseudo-code listing of a syntax-directed type checking algorithm that implements the rules of the type system and is capable of inferring the type of a closed flow specification.
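For readers unfamiliar with the term, here is a minimal sketch of what a syntax-directed type checking algorithm looks like on a toy expression language; TRAFFIC's actual syntax and typing rules are those of the cited reports and are not reproduced here.

```python
# A toy syntax-directed checker: each syntactic form matches exactly one rule,
# so type inference is a single structural recursion over the term.
# This illustrative language is NOT TRAFFIC.
from dataclasses import dataclass

@dataclass
class Lit:
    value: int

@dataclass
class Var:
    name: str

@dataclass
class Add:
    left: object
    right: object

@dataclass
class Let:
    name: str
    bound: object
    body: object

def infer(term, env=None):
    """Return the type of a term under env, or raise TypeError."""
    env = env or {}
    match term:
        case Lit():
            return "int"
        case Var(name):
            return env[name]
        case Add(left, right):
            if infer(left, env) == infer(right, env) == "int":
                return "int"
            raise TypeError("operands of + must have type int")
        case Let(name, bound, body):
            return infer(body, {**env, name: infer(bound, env)})

# A closed term type-checks to "int".
print(infer(Let("x", Lit(1), Add(Var("x"), Lit(2)))))
```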
Abstract:
Weak references are references that do not prevent the object they point to from being garbage collected. Most realistic languages, including Java, SML/NJ, and OCaml to name a few, have some facility for programming with weak references. Weak references are used in implementing idioms like memoizing functions and hash-consing in order to avoid potential memory leaks. However, the semantics of weak references in many languages are not clearly specified. Without a formal semantics for weak references it becomes impossible to prove the correctness of implementations making use of this feature. Previous work by Hallett and Kfoury extends λgc, a language for modeling garbage collection, to λweak, a similar language with weak references. Using this previously formalized semantics for weak references, we consider two issues related to the well-behavedness of programs. Firstly, we provide a new, simpler proof of the well-behavedness of the syntactically restricted fragment of λweak defined previously. Secondly, we give a natural semantic criterion for well-behavedness that is much broader than the syntactic restriction and is useful as a principle for programming with weak references. Furthermore, we extend a result previously proved for λgc that allows one to use type inference to collect some reachable objects that are never used. We prove that this result holds for our language, and we extend it to allow the collection of weakly-referenced reachable garbage without incurring the computational overhead sometimes associated with collecting weak bindings (e.g. the need to recompute a memoized function). Lastly, we extend the semantic framework to model the key/value weak references found in Haskell, and we prove that the Haskell semantics is equivalent to a simpler semantics owing to the lack of side effects in our language.
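A minimal sketch, in Python's weakref module rather than any of the languages named above, of the hash-consing/memoization idiom the abstract refers to; the Pair class and the cache below are illustrative and are not part of the λweak formalization.

```python
import weakref

class Pair:
    """An immutable pair that we want to hash-cons (share structurally equal copies)."""
    __slots__ = ("left", "right", "__weakref__")
    def __init__(self, left, right):
        self.left, self.right = left, right

# Values are held weakly: once no ordinary reference to a Pair remains, the
# garbage collector may reclaim it and its cache entry disappears, so the
# cache itself never causes a memory leak.
_cache = weakref.WeakValueDictionary()

def make_pair(left, right):
    key = (left, right)
    existing = _cache.get(key)
    if existing is not None:
        return existing          # reuse the shared instance
    fresh = Pair(left, right)
    _cache[key] = fresh
    return fresh

p = make_pair(1, 2)
q = make_pair(1, 2)
assert p is q                    # structurally equal pairs are physically shared
del p, q                         # with no strong references left, the entry may be collected
```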
Abstract:
A novel spectroscopic method, incoherent broadband cavity enhanced absorption spectroscopy (IBBCEAS), has been modified and extended to measure absorption spectra in the near-ultraviolet with high sensitivity. The near-ultraviolet region extends from 300 to 400 nm and is particularly important in tropospheric photochemistry; absorption of near-UV light can also be exploited for sensitive trace gas measurements of several key atmospheric constituents. In this work, several IBBCEAS instruments were developed to record reference spectra and to measure trace gas concentrations in the laboratory and field. An IBBCEAS instrument was coupled to a flow cell for measuring very weak absorption spectra between 335 and 375 nm. The instrument was validated against the literature absorption spectrum of SO2. Using the instrument, we report new absorption cross-sections of O3, acetone, 2-butanone, and 2-pentanone in this spectral region, where literature data diverge considerably owing to the extremely weak absorption. The instrument was also applied to quantifying low concentrations of the short-lived radical, BrO, in the presence of strong absorption by Br2 and O3. A different IBBCEAS system was adapted to a 4 m³ atmosphere simulation chamber to record the absorption cross-sections of several low vapour pressure compounds, which are otherwise difficult to measure. Absorption cross-sections of benzaldehyde and the more volatile alkyl nitrites agree well with previous spectra; on this basis, the cross-sections of several nitrophenols are reported for the first time. In addition, the instrument was also used to study the optical properties of secondary organic aerosol formed following the photooxidation of isoprene. An extractive IBBCEAS instrument was developed for detecting HONO and NO2 and had a sensitivity of about 10⁻⁹ cm⁻¹. This instrument participated in a major international intercomparison of HONO and NO2 measurements held in the EUPHORE simulation chamber in Valencia, Spain, and results from that campaign are also reported here.
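A minimal sketch of how an extinction spectrum is typically retrieved in IBBCEAS, using the commonly cited relation α = (I0/I − 1)(1 − R)/d; the mirror reflectivity, cavity length and intensities below are illustrative values, not the calibration of the instruments described above.

```python
import numpy as np

R = 0.9998            # mirror reflectivity (illustrative assumption)
d = 100.0             # mirror separation / sample path length in cm (assumption)
I0 = 1.000            # intensity transmitted through the empty cavity (arbitrary units)
I = np.array([0.995, 0.990, 0.980])   # transmitted intensity with absorber present

# Commonly used IBBCEAS relation for the extinction coefficient, in cm^-1.
alpha = (I0 / I - 1.0) * (1.0 - R) / d
print(alpha)   # values of order 1e-8 cm^-1 for these illustrative inputs
```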
Abstract:
The application of sourdough can improve the texture, structure, nutritional value, staling rate and shelf life of wheat and gluten-free breads. These quality improvements are associated with the formation of organic acids, exopolysaccharides (EPS), aroma or antifungal compounds. Initially, the suitability of two lactic acid bacteria strains to serve as sourdough starters for buckwheat, oat, quinoa, sorghum and teff flours was investigated. Wheat flour was chosen as a reference. The obligate heterofermentative lactic acid bacterium (LAB) Weissella cibaria MG1 (Wc) formed the EPS dextran (an α-1,6-glucan) from sucrose in situ, with a molecular size of 10⁶ to 10⁷ kDa. EPS formation in all breads was analysed using size exclusion chromatography, and the highest amounts were formed in buckwheat (4 g/kg) and quinoa sourdough (3 g/kg). The facultative heterofermentative Lactobacillus plantarum FST1.7 (Lp) was identified as a strong acidifier and was chosen due to its ubiquitous presence in gluten-free as well as wheat sourdoughs (Vogelmann et al. 2009). Both Wc and Lp showed the highest total titratable acids in buckwheat (16.8 ml; 26.0 ml), teff (16.2 ml; 24.5 ml) and quinoa sourdoughs (26.4 ml; 35.3 ml), correlating with higher amounts of fermentable sugars and higher buffering capacities. Sourdough incorporation reduced crumb hardness after five days of storage in buckwheat (Wc -111%), teff (Wc -39%) and wheat (Wc -206%; Lp -118%) sourdough breads. The rate of staling (N/day) was reduced in buckwheat (Ctrl 8 N; Wc 3 N; Lp 6 N), teff (Ctrl 13 N; Wc 9 N; Lp 10 N) and wheat (Ctrl 5 N; Wc 1 N; Lp 2 N) sourdough breads. Bread dough softening upon Wc and Lp sourdough incorporation accounted for increased crumb porosity in buckwheat (+10.4%; +4.7%), teff (+8.1%; +8.3%) and wheat sourdough breads (+8.7%; +6.4%). Weissella cibaria MG1 sourdough improved the aroma quality of wheat bread but had no impact on the aroma of gluten-free breads. Microbial shelf life, however, was not prolonged in any of the breads regardless of the starter culture used. Due to the high prevalence of insulin-dependent diabetes mellitus, particularly amongst coeliac patients, glycaemic control is of great importance (Berti et al. 2004). The in vitro starch digestibility of gluten-free breads with and without sourdough addition was analysed to predict the glycaemic index (pGI). Sourdough can decrease starch hydrolysis in vitro due to the formation of resistant starch and organic acids. The predicted GI of the gluten-free control breads was significantly lower than that of the reference white wheat bread (GI=100). Starch granule size was investigated with scanning electron microscopy and was significantly smaller in quinoa flour (<2 μm). This resulted in higher enzymatic susceptibility and hence a higher pGI for quinoa bread (95). The lowest hydrolysis indices, for the sorghum and teff control breads (72 and 74, respectively), correlated with higher gelatinisation peak temperatures (69°C and 71°C, respectively). Levels of resistant starch were not increased by the addition of Weissella cibaria MG1 (weak acidifier) or Lactobacillus plantarum FST1.7 (strong acidifier). The pGI was significantly decreased for both wheat sourdough breads (Wc 85; Lp 76). Lactic acid can promote starch interactions with gluten, thereby decreasing starch susceptibility (Östman et al. 2002). For most gluten-free breads, the pGI was increased upon sourdough addition. Only the sorghum and teff Lp sourdough breads (69 and 68, respectively) had significantly decreased pGI.
Results suggest that the increase in starch hydrolysis in gluten-free breads was related to mechanisms other than the presence of organic acids and the formation of resistant starch.
Abstract:
Coastal lagoons are defined as shallow coastal water bodies partially separated from the adjacent sea by a restrictive barrier. Coastal lagoons are protected under Annex I of the European Habitats Directive (92/43/EEC). Lagoons are also considered to be “transitional water bodies” and are therefore included in the “register of protected areas” under the Water Framework Directive (2000/60/EC). Consequently, EU member states are required to establish monitoring plans and to regularly report on lagoon condition and conservation status. Irish lagoons are considered relatively rare and unusual because of their North Atlantic, macrotidal location on high-energy coastlines, and they have received little attention. This work aimed to assess the physicochemical and ecological status of three lagoons, Cuskinny, Farranamanagh and Toormore, on the southwest coast of Ireland. Baseline salinity, nutrient and biological conditions were determined in order to provide reference conditions to detect perturbations, and to inform future maintenance of ecosystem health. Accumulation of organic matter is an increasing pressure in coastal lagoon habitats worldwide, often compounding existing eutrophication problems. This research also aimed to investigate the in situ decomposition process in a lagoon habitat and to explore the associated invertebrate assemblages. Re-classification of the lagoons, under the guidelines of the Venice system for the classification of marine waters according to salinity, was completed by taking spatial and temporal changes in salinity regimes into consideration. Based on the results of this study, Cuskinny, Farranamanagh and Toormore lagoons are now classified as mesohaline (5 ppt – 18 ppt), oligohaline (0.5 ppt – 5 ppt) and polyhaline (18 ppt – 30 ppt), respectively. Varying vertical, longitudinal and transverse salinity patterns were observed in the three lagoons. Strong correlations between salinity and cumulative rainfall highlighted the important role of precipitation in controlling the lagoon environment. The maximum effect of precipitation on lagoon salinity was observed between four and fourteen days later, depending on catchment area geology, indicating the uniqueness of each lagoon system. Seasonal nutrient patterns were evident in the lagoons. Nutrient concentrations were found to be reflective of the catchment area and the magnitude of the freshwater inflow. Assessment based on the Redfield molar ratio indicated a trend towards phosphorus, rather than nitrogen, limitation in Irish lagoons. Investigation of the decomposition process in Cuskinny Lagoon revealed that the greatest biomass loss occurred in the winter season. The lowest biomass loss occurred in spring, possibly due to the high density of invertebrates feeding on the thick microbial layer rather than the decomposing litter. It has been reported that the decomposition of plant biomass is highest in the preferential distribution area of the plant species; however, no similar trend was observed in this study, with the most active zones of decomposition varying spatially throughout the seasons. Macroinvertebrate analysis revealed low species diversity but high abundance, indicating the dominance of a small number of species. Invertebrate assemblages within the lagoon varied significantly from communities in the adjacent freshwater or marine environments.
Although the study was carried out in coastal lagoons on the southwest coast of Ireland, it is envisaged that its overall findings are relevant throughout the entire island of Ireland, and possibly to many North Atlantic coastal lagoon ecosystems elsewhere.
Abstract:
This paper looks into economic insights offered by a consideration of two important financial markets in Vietnam, gold and USD. In general, the paper focuses on time series properties, mainly returns at different frequencies, and tests the weak-form efficient market hypothesis. All the tests reject the efficiency of both the gold and foreign exchange markets. All time series exhibit strong serial correlations. ARMA-GARCH specifications appear to have performed well with the different time series. In all cases the changing-volatility phenomenon is strongly supported by the empirical data. An additional test is performed on the daily USD return to try to capture the impacts of the Asian financial crisis and of the applicable daily price limits. No substantial impact of the Asian crisis or of the central bank-devised limits on the risk level of the daily USD return is found.
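A minimal sketch of the kind of weak-form efficiency test and ARMA-GARCH fit described above, using synthetic returns in place of the gold/USD series; the package choices (statsmodels, arch) are an assumption, not necessarily those used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.diagnostic import acorr_ljungbox
from arch import arch_model

rng = np.random.default_rng(1)

# Synthetic daily returns with mild autocorrelation and volatility clustering,
# standing in for the gold/USD series.
n = 1500
eps = rng.standard_normal(n)
vol = np.empty(n); vol[0] = 0.01
ret = np.empty(n); ret[0] = 0.0
for t in range(1, n):
    vol[t] = np.sqrt(1e-6 + 0.1 * ret[t - 1] ** 2 + 0.85 * vol[t - 1] ** 2)
    ret[t] = 0.1 * ret[t - 1] + vol[t] * eps[t]
returns = pd.Series(ret)

# Ljung-Box test: significant serial correlation rejects weak-form efficiency.
print(acorr_ljungbox(returns, lags=[10], return_df=True))

# AR(1)-GARCH(1,1) fit capturing the changing-volatility phenomenon
# (returns scaled by 100 to help the optimizer).
model = arch_model(100 * returns, mean="ARX", lags=1, vol="GARCH", p=1, q=1)
print(model.fit(disp="off").summary())
```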
Abstract:
We have used analytical ultracentrifugation to characterize the binding of the methionine repressor protein, MetJ, to synthetic oligonucleotides containing zero to five specific recognition sites, called metboxes. For all lengths of DNA studied, MetJ binds more tightly to repeats of the consensus sequence than to naturally occurring metboxes, which exhibit a variable number of deviations from the consensus. Strong cooperative binding occurs only in the presence of two or more tandem metboxes, which facilitate protein-protein contacts between adjacent MetJ dimers, but weak affinity is detected even with DNA containing zero or one metbox. The affinity of MetJ for all of the DNA sequences studied is enhanced by the addition of SAM, the known cofactor for MetJ in the cell. This effect extends to oligos containing zero or one metbox, both of which bind two MetJ dimers. In the presence of a large excess concentration of metbox DNA, the effect of cooperativity is to favor populations of DNA oligos bound by two or more MetJ dimers rather than a stochastic redistribution of the repressor onto all available metboxes. These results illustrate the dynamic range of binding affinity and repressor assembly that MetJ can exhibit with DNA and the effect of the corepressor SAM on binding to both specific and nonspecific DNA.
Abstract:
BACKGROUND: A candidate vaccine consisting of human immunodeficiency virus type 1 (HIV-1) subunit gp120 protein was found previously to be nonprotective in an efficacy trial (Vax004) despite strong antibody responses against the vaccine antigens. Here we assessed the magnitude and breadth of neutralizing antibody responses in Vax004. METHODS: Neutralizing antibodies were measured against highly sensitive (tier 1) and moderately sensitive (tier 2) strains of HIV-1 subtype B in 2 independent assays. Vaccine recipients were stratified by sex, race, and high versus low behavioral risk of HIV-1 acquisition. RESULTS: Most vaccine recipients mounted potent neutralizing antibody responses against HIV-1(MN) and other tier 1 viruses. Occasional weak neutralizing activity was detected against tier 2 viruses. The response against tier 1 and tier 2 viruses was significantly stronger in women than in men. Race and behavioral risk of HIV-1 acquisition had no significant effect on the response. Prior vaccination had little effect on the neutralizing antibody response that arose after infection. CONCLUSIONS: The weak overall neutralizing antibody responses against tier 2 viruses are consistent with the lack of protection in this trial. The magnitude and breadth of neutralization reported here should be useful for identifying improved vaccines.
Abstract:
We describe a general technique for determining upper bounds on maximal values (or lower bounds on minimal costs) in stochastic dynamic programs. In this approach, we relax the nonanticipativity constraints that require decisions to depend only on the information available at the time a decision is made and impose a "penalty" that punishes violations of nonanticipativity. In applications, the hope is that this relaxed version of the problem will be simpler to solve than the original dynamic program. The upper bounds provided by this dual approach complement lower bounds on values that may be found by simulating with heuristic policies. We describe the theory underlying this dual approach and establish weak duality, strong duality, and complementary slackness results that are analogous to the duality results of linear programming. We also study properties of good penalties. Finally, we demonstrate the use of this dual approach in an adaptive inventory control problem with an unknown and changing demand distribution and in valuing options with stochastic volatilities and interest rates. These are complex problems of significant practical interest that are quite difficult to solve to optimality. In these examples, our dual approach requires relatively little additional computation and leads to tight bounds on the optimal values. © 2010 INFORMS.
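A minimal sketch of the zero-penalty ("perfect information") version of this dual bound on a toy Bermudan put, paired with a simple heuristic policy for the lower bound; all model parameters and the threshold rule are illustrative assumptions, not the paper's examples, and a well-chosen penalty would tighten the upper bound further.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, K, r, sigma, T, steps, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20_000
dt = T / steps
disc = np.exp(-r * dt)

# Simulate lognormal asset paths, including the value at t = 0.
z = rng.standard_normal((n_paths, steps))
log_increments = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
S = S0 * np.exp(np.cumsum(log_increments, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])

payoff = np.maximum(K - S, 0.0)                 # put payoff at each exercise date
discount = disc ** np.arange(steps + 1)

# Lower bound: a simple nonanticipative heuristic policy
# (exercise the first time the payoff exceeds a fixed threshold).
threshold = 5.0
exercise = payoff >= threshold
exercise[:, -1] = True                          # must decide by maturity
first = exercise.argmax(axis=1)
lower = np.mean(discount[first] * payoff[np.arange(n_paths), first])

# Upper bound: relax the nonanticipativity constraints with zero penalty,
# i.e. stop each path with full hindsight; E[max_t discounted payoff]
# bounds the optimal value from above.
upper = np.mean((discount * payoff).max(axis=1))

print(f"heuristic lower bound: {lower:.3f}")
print(f"hindsight upper bound: {upper:.3f}")
```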