33 results for Free volume


Relevance: 20.00%

Publisher:

Abstract:

The output of a laser is a high-frequency propagating electromagnetic field with superior coherence and brightness compared to that emitted by thermal sources. A multitude of different types of lasers exist, which also translates into large differences in the properties of their output. Moreover, the characteristics of the electromagnetic field emitted by a laser can be influenced from the outside, e.g., by injecting an external optical field or by optical feedback. In the case of free-running solitary class-B lasers, such as semiconductor and Nd:YVO4 solid-state lasers, the phase space is two-dimensional, the dynamical variables being the population inversion and the amplitude of the electromagnetic field. The two-dimensional structure of the phase space means that no complex dynamics can be found. If a class-B laser is perturbed from its steady state, then the steady state is restored after a short transient. However, as discussed in part (i) of this Thesis, the static properties of class-B lasers, as well as their artificially induced or noise-induced dynamics around the steady state, can be experimentally studied in order to gain insight into laser behaviour, and to determine model parameters that are not known ab initio. In this Thesis particular attention is given to the linewidth enhancement factor, which describes the coupling between the gain and the refractive index in the active material. A highly desirable attribute of an oscillator is stability, both in frequency and amplitude. Nowadays, however, instabilities in coupled lasers have become an active area of research, motivated not only by the interesting complex nonlinear dynamics but also by potential applications. In part (ii) of this Thesis the complex dynamics of unidirectionally coupled, i.e., optically injected, class-B lasers is investigated. An injected optical field increases the dimensionality of the phase space to three by turning the phase of the electromagnetic field into an important variable.
This has a radical effect on laser behaviour, since very complex dynamics, including chaos, can be found in a nonlinear system with three degrees of freedom. The output of the injected laser can be controlled in experiments by varying the injection rate and the frequency of the injected light. In this Thesis the dynamics of unidirectionally coupled semiconductor and Nd:YVO4 solid-state lasers is studied numerically and experimentally.
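The class-B dynamics described above can be sketched numerically. The following is a minimal illustration only, not the thesis's actual model or parameter values: a dimensionless injected-semiconductor-laser model in which a complex field E couples to the inversion N through the linewidth enhancement factor alpha, with an injection term of strength K and detuning delta. Here K = 0 (the free-running solitary case), so after the relaxation-oscillation transient the field settles to |E|² = P with N = 0.

```python
# Illustrative dimensionless class-B laser rate equations with an
# optical injection term (all parameter values are hypothetical).
alpha = 3.0    # linewidth enhancement factor
P     = 0.5    # pump parameter above threshold
T     = 50.0   # carrier-to-photon lifetime ratio (class-B: T >> 1)
K     = 0.0    # injection strength (0 = free-running solitary laser)
delta = 0.0    # detuning of the injected field

E = 0.1 + 0.0j   # complex electric field amplitude
N = 0.2          # population inversion
dt, steps = 0.01, 100_000

for _ in range(steps):
    dE = 0.5 * (1 + 1j * alpha) * N * E + K - 1j * delta * E
    dN = (P - N - (1 + 2 * N) * abs(E) ** 2) / T
    E += dt * dE
    N += dt * dN

# Free-running steady state: N -> 0 and |E|^2 -> P once the
# relaxation oscillations have damped out.
print(abs(E) ** 2, N)
```

With K = 0 the field phase decouples, which is why the effective phase space is two-dimensional; raising K and sweeping delta makes the phase dynamically relevant, the regime in which locking, periodic and chaotic outputs appear.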

Relevance: 20.00%

Publisher:

Abstract:

Summary: Lake ice thickness and volume in Finland during the period 1961-90

Relevance: 20.00%

Publisher:

Abstract:

The syntactic constraints of Koskenniemi's Finite-State Intersection Grammar (FSIG) are logically less complex than the formalism used in them would suggest. It turns out that although Voutilainen's (1994) FSIG description of English employs several extensions of regular expressions, the grammar description as a whole reduces to a finite combination of union, complement and concatenation. This is a substantial improvement on the descriptive complexity of ENGFSIG. The result opens the door to a deeper analysis of the logical properties of FSIG descriptions and to the possible optimization of FSIG descriptions. The proof contains a new formula that translates Koskenniemi's restriction operation without marker symbols.
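The three operations the abstract reduces ENGFSIG to (union, complement, concatenation) can be illustrated with a toy model. Real FSIG compilation works on finite-state automata; the sketch below is only a bounded stand-in, modelling languages as finite sets of strings over a length-limited universe so that complement stays exact. The "no bb" constraint is an invented example, not one of Voutilainen's rules.

```python
# Toy illustration of union, complement and concatenation over a
# bounded universe of strings (not Koskenniemi's actual formalism).
from itertools import product

SIGMA = "ab"
MAX_LEN = 4
UNIVERSE = {"".join(p) for n in range(MAX_LEN + 1)
            for p in product(SIGMA, repeat=n)}

def union(l1, l2):
    return l1 | l2

def complement(l):
    return UNIVERSE - l

def concat(l1, l2):
    # Keep only results that still fit in the bounded universe.
    return {u + v for u in l1 for v in l2 if len(u + v) <= MAX_LEN}

# A toy "constraint": strings that do NOT contain "bb", expressed as
# the complement of SIGMA* bb SIGMA* using only the three operations.
contains_bb = concat(concat(UNIVERSE, {"bb"}), UNIVERSE)
no_bb = complement(contains_bb)

print(sorted(no_bb))
```

The point of the reduction in the abstract is that once every constraint is expressed with just these three closed operations, standard automata constructions suffice to compile and analyse the whole grammar.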

Relevance: 20.00%

Publisher:

Abstract:

Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide as well as the share of these articles available openly on the Web, either directly or as copies in e-print repositories. Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by sampling and Google searches. Analysis. A central issue is the finding that ISI-indexed journals publish far more articles per year (111) than non-ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies, which counted the copies in identified repositories; we instead start from a random sample of articles and test whether copies can be found by a Web search engine. Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number, 4.6% became immediately openly available and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors. Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the ongoing debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.
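The headline percentages can be turned back into article counts with straightforward arithmetic. The figures below are taken directly from the abstract; the script only performs that calculation.

```python
total_articles = 1_350_000   # estimated world total for 2006 (from the abstract)

gold_immediate = total_articles * 0.046   # openly available on publication
gold_delayed   = total_articles * 0.035   # open after a ~1-year embargo
green_copies   = total_articles * 0.113   # usable copies in repositories or on home pages

open_total = gold_immediate + gold_delayed + green_copies
print(f"{gold_immediate:,.0f} + {gold_delayed:,.0f} + {green_copies:,.0f} "
      f"= {open_total:,.0f} articles ({open_total / total_articles:.1%})")
```

This puts the overall openly available share for 2006 at roughly one article in five.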

Relevance: 20.00%

Publisher:

Abstract:

In this article, I propose to analyze narrative theory from an epistemological standpoint. To do so, I will draw upon both Genettian narratology and what I would call, following Shigeyuki Kuroda, “non-communicational” theories of fictional narrative. In spite of their very unequal popularity, I consider these theories as objective, or, in other words, as debatable and ripe for rational analyses; one can choose between them. The article is made up of three parts. The first part concerns the object of narrative theory, or the narrative as a constructed object, both in narratology (where narrative is likened to a narrative discourse) and in non-communicational narrative theories (where fictional narrative and discourse are mutually exclusive categories). The second part takes up the question of how the claims of these theories do or do not lend themselves to falsification. In particular, Gérard Genette’s claim that “every narrative is, explicitly or not, ‘in the first person’”, will be considered, through the lens of Ann Banfield’s theory of free indirect style. In the third part the reductionism of narrative theory will be dealt with. This leads to a reflection on the role of narrative theory in the analysis of fictional narratives.

Relevance: 20.00%

Publisher:

Abstract:

Transposed to media like film, drama, opera, music, and the visual arts, “narrative” is no longer characterized by either temporality or an act of telling, both required by earlier narratological theories. Transposed to other disciplines, “narrative” is often a substitute for “assumption”, “hypothesis”, a disguised ideological stance, a cognitive scheme, and even life itself. The potential for broadening the concept lay dormant in narratology, both in the double use of “narrative” for the medium-free fabula and for the medium-bound sjuzet, and in changing interpretations of “event”. Some advantages of the broad use of “narrative” are an evocation of commonalities among media and disciplines, an invitation to re-think the term within the originating discipline, a constructivist challenge to positivistic and foundational views, an emphasis on a plurality of competing “truths”, and an empowerment of minority voices. Conversely, disadvantages of the broad use are an illusion of sameness whenever the term is used and the obliteration of specificity. In a Wittgensteinian spirit, the essay agrees that concepts of narrative are mutually related by “family resemblance”, but wishes to probe the resemblances further. It thus postulates two necessary features: double temporality and a transmitting (or mediating) agency, and an additional cluster of variable optional characteristics. When the necessary features are not dominant, the configuration may have “narrative elements” but is not “a narrative”.

Relevance: 20.00%

Publisher:

Abstract:

This study contributes to the neglect effect literature by looking at relative trading volume in terms of value. The results for the Swedish market show a significant positive relationship between the accuracy of estimation and relative trading volume. Market capitalisation and analyst coverage have been used as proxies for neglect in prior studies. These measures, however, do not take into account the effort analysts put in when estimating corporate pre-tax profits. I also find evidence that the industry of the firm influences the accuracy of estimation. In addition, supporting earlier findings, loss-making firms are associated with larger forecasting errors. Further, I find that the average forecast error in Sweden increased in the year 2000.
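The core relationship the abstract reports, estimation accuracy improving with relative trading volume, can be sketched as a simple regression. Everything below is synthetic, illustrative data, not the Swedish sample from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic firms: log relative trading volume and absolute forecast
# error, generated so that error falls as volume rises (the "neglect"
# effect). The coefficients are invented for illustration.
n = 200
log_volume = rng.uniform(0.0, 5.0, n)
forecast_error = 1.0 - 0.15 * log_volume + rng.normal(0.0, 0.05, n)

# OLS fit of error on log-volume; polyfit returns slope first.
slope, intercept = np.polyfit(log_volume, forecast_error, 1)
print(f"slope = {slope:.3f}")   # negative: higher volume, smaller error
```

A significantly negative slope in such a regression is what a positive accuracy-volume relationship looks like when accuracy is measured as forecast error.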

Relevance: 20.00%

Publisher:

Abstract:

Microchips for use in biomolecular analysis show great promise for medical diagnostics and biomedical basic research. Among the potential advantages are more sensitive and faster analyses as well as reduced cost and sample consumption. Due to scaling laws, the surface-area-to-volume ratios of microfluidic chips are very high. Because of this, tailoring the surface properties and surface functionalization are very important technical issues for microchip development. This thesis studies two different types of functional surfaces, surfaces for open-surface capillary microfluidics and surfaces for surface-assisted laser desorption ionization mass spectrometry, and combinations thereof. Open-surface capillary microfluidics can be used to transport and control liquid samples on easily accessible open surfaces based on surface forces alone, without any connections to pumps or electrical power sources. Capillary filling of open, partially wetting grooves is shown to be possible for certain geometries, aspect ratios and contact angles, and a theoretical model is developed to identify complete channel filling domains as well as partial filling domains. On the other hand, partially wetting surfaces with triangular microstructures can be used to achieve directional wetting, where water droplets do not spread isotropically but instead spread only to a predetermined sector. Furthermore, by patterning completely wetting and superhydrophobic areas on the same surface, complex droplet shapes are achieved, as the water stretches to make contact with the wetting surface but does not enter the superhydrophobic domains. Surfaces for surface-assisted laser desorption ionization mass spectrometry are developed by applying various active thin-film coatings on multiple substrates in order to separate surface and bulk effects. Clear differences are observed between both surface and substrate layers.
The best-performing surfaces consisted of an amorphous silicon coating and an inorganic-organic hybrid substrate with nanopillars and nanopores. These surfaces are used for matrix-free ionization of drugs, peptides and proteins, and for some analytes the detection limits were in the high-attomole range. Microfluidics and laser desorption ionization surfaces are combined on functionalized drying platforms, where the surface is used to control the shape of the deposited analyte droplet, and the shape of the initial analyte droplet affects the dried-droplet solute deposition pattern. The deposited droplets can then be directly detected by mass spectrometry. Utilizing this approach, analyte concentration, splitting and separation are demonstrated.
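One widely used filling criterion for open capillary channels compares the contact angle with the ratio of free to wetted perimeter of the cross-section: for an open rectangular groove of width w and depth h, spontaneous filling is predicted when cos θ > w / (w + 2h). This is the generic spontaneous-capillary-flow condition chosen here purely as an illustration; it is not necessarily the model developed in the thesis.

```python
import math

def fills(theta_deg: float, width: float, depth: float) -> bool:
    """Spontaneous capillary filling criterion for an open rectangular
    groove: cos(theta) must exceed free perimeter / wetted perimeter.
    Illustrative textbook condition, not the thesis's own model."""
    free_perimeter = width                 # the open top of the groove
    wetted_perimeter = width + 2 * depth   # bottom plus two side walls
    return math.cos(math.radians(theta_deg)) > free_perimeter / wetted_perimeter

# Aspect ratio h/w = 1: critical angle is arccos(1/3), about 70.5 degrees,
# so a 60-degree contact angle fills while an 80-degree one does not.
print(fills(60.0, 1.0, 1.0), fills(80.0, 1.0, 1.0))
```

Sweeping the aspect ratio h/w in such a criterion traces out exactly the kind of filling/non-filling domain map that the abstract describes for open grooves.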

Relevance: 20.00%

Publisher:

Abstract:

Two methods of pre-harvest inventory were designed and tested on three cutting sites containing a total of 197 500 m3 of wood. These sites were located in flat-ground boreal forests in northwestern Quebec. Both methods studied involved scaling of trees harvested to clear the road path one year (or more) prior to harvest of the adjacent cut-blocks. The first method (ROAD) divides the total road right-of-way volume by the total road area cleared. The resulting volume per hectare is then multiplied by the total cut-block area scheduled for harvest during the following year to obtain the total estimated cutting volume. The second method (STRATIFIED) also involves scaling of trees cleared from the road. However, in STRATIFIED, log scaling data are stratified by forest stand location. A volume per hectare is calculated for each stretch of road that crosses a single forest stand. This volume per hectare is then multiplied by the remaining area of the same forest stand scheduled for harvest one year later. The sum of all resulting estimated volumes per stand gives the total estimated cutting volume for all cut-blocks adjacent to the studied road. A third method (MNR) was also used to estimate cut-volumes of the sites studied. This method represents the existing technique for estimating cutting volume in the province of Quebec. It involves summing the cut volume over all forest stands, where the cut volume of each stand is estimated by multiplying its area by the volume per hectare obtained from standard stock tables provided by the government. The resulting total estimated volume per cut-block for all three methods was then compared with the actual measured cut-block volume (MEASURED). This analysis revealed a significant difference between the MEASURED and MNR methods, with the MNR volume estimate being 30 % higher than MEASURED.
However, no significant difference from MEASURED was observed for volume estimates for the ROAD and STRATIFIED methods which respectively had estimated cutting volumes 19 % and 5 % lower than MEASURED. Thus the ROAD and STRATIFIED methods are good ways to estimate cut-block volumes after road right-of-way harvest for conditions similar to those examined in this study.
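The two scaling-based estimators can be written down directly. The numbers below are made-up toy data, used only to show how ROAD pools all right-of-way volume into one average while STRATIFIED keeps a separate volume per hectare for each forest stand.

```python
# Toy per-stand data (hypothetical values): road right-of-way volume
# scaled (m^3), road area cleared in that stand (ha), and remaining
# stand area scheduled for harvest (ha).
stands = {
    "A": {"road_volume": 100.0, "road_area": 2.0, "cut_area": 10.0},
    "B": {"road_volume": 30.0,  "road_area": 1.0, "cut_area": 8.0},
}

def road_estimate(stands):
    """ROAD: one pooled volume/ha from the whole right-of-way,
    applied to the total scheduled cut-block area."""
    total_volume = sum(s["road_volume"] for s in stands.values())
    total_road_area = sum(s["road_area"] for s in stands.values())
    total_cut_area = sum(s["cut_area"] for s in stands.values())
    return total_volume / total_road_area * total_cut_area

def stratified_estimate(stands):
    """STRATIFIED: volume/ha computed per forest stand, applied to the
    remaining area of that same stand, then summed."""
    return sum(s["road_volume"] / s["road_area"] * s["cut_area"]
               for s in stands.values())

print(road_estimate(stands), stratified_estimate(stands))
```

When the stands differ in density, the two estimates diverge, which is why stratifying the scaling data by stand can track the measured volume more closely.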

Relevance: 20.00%

Publisher:

Abstract:

Organocatalysis, the use of organic molecules as catalysts, is attracting increasing attention as one of the most modern and rapidly growing areas of organic chemistry, with countless research groups in both academia and the pharmaceutical industry around the world working on this subject. The literature review of this thesis mainly focuses on metal-free systems for hydrogen activation and organocatalytic reduction. Since these research topics are relatively new, the literature review also highlights the basic principles of the use of Lewis acid-Lewis base pairs, which do not react irreversibly with each other, as traps for small molecules. The experimental section progresses from the first observation of the facile heterolytic cleavage of hydrogen gas by amines and B(C6F5)3 to highly active non-metal catalysts for both enantioselective and racemic hydrogenation of unsaturated nitrogen-containing compounds. Moreover, detailed studies of the structure-reactivity relationships of these systems by X-ray and neutron diffraction, NMR methods and quantum chemical calculations were performed to gain further insight into the mechanism of hydrogen activation and hydrogenation by boron-nitrogen compounds.

Relevance: 20.00%

Publisher:

Abstract:

People with coeliac disease have to maintain a gluten-free diet, which means excluding wheat, barley and rye prolamin proteins from their diet. Immunochemical methods are used to analyse the harmful proteins and to control the purity of gluten-free foods. In this thesis, the behaviour of prolamins in immunological gluten assays and with different prolamin-specific antibodies was examined. The immunoassays were also used to detect residual rye prolamins in sourdough systems after enzymatic hydrolysis and wheat prolamins after deamidation. The aim was to characterize the ability of the gluten analysis assays to quantify different prolamins in varying matrices in order to improve the accuracy of the assays. Prolamin groups of cereals consist of a complex mixture of proteins that vary in their size and amino acid sequences. Two common characteristics distinguish prolamins from other cereal proteins. Firstly, they are soluble in aqueous alcohols, and secondly, most of the prolamins are mainly formed from repetitive amino acid sequences containing high amounts of proline and glutamine. The diversity among prolamin proteins sets high requirements for their quantification. In the present study, prolamin contents were evaluated using enzyme-linked immunosorbent assays based on ω- and R5 antibodies. In addition, assays based on A1 and G12 antibodies were used to examine the effect of deamidation on prolamin proteins. The prolamin compositions and the cross-reactivity of antibodies with prolamin groups were evaluated with electrophoretic separation and Western blotting. The results of this thesis research demonstrate that the currently used gluten analysis methods are not able to accurately quantify barley prolamins, especially when hydrolysed or mixed in oats. However, more precise results can be obtained when the standard more closely matches the sample proteins, as demonstrated with barley prolamin standards. The study also revealed that all of the harmful prolamins, i.e. 
wheat, barley and rye prolamins, are most efficiently extracted with 40% 1-propanol containing 1% dithiothreitol at 50 °C. The extractability of barley and rye prolamins was considerably higher with 40% 1-propanol than with 60% ethanol, which is typically used for prolamin extraction. The prolamin levels of rye were lowered by 99.5% from the original levels when an enzyme-active rye-malt sourdough system was used for prolamin degradation. Such extensive degradation of rye prolamins supports the use of sourdough as a part of gluten-free baking. Deamidation increases the diversity of prolamins and improves their solubility and their ability to form structures such as emulsions and foams. Deamidation changes the protein structure, which has consequences for antibody recognition in gluten analysis. According to the results of the present work, the analysis methods were not able to quantify wheat gluten after deamidation except at very high concentrations. Consequently, deamidated gluten peptides can exist in food products and remain undetected, and thus pose a risk for people with gluten intolerance. The results of this thesis demonstrate that current gluten analysis methods cannot accurately quantify prolamins in all food matrices. New information on the prolamins of rye and barley, in addition to wheat prolamins, is also provided in this thesis, which is essential for improving gluten analysis methods so that they can more accurately quantify prolamins from harmful cereals.