938 results for methods of resolution enhancement


Relevância:

100.00%

Publicador:

Resumo:

Credit risk assessment is an integral part of banking. Credit risk means that the expected return will not materialise if the customer fails to fulfil its obligations. A key component of banking is therefore setting acceptance criteria for granting loans. The theoretical part of the study focuses on the key components of banks' credit assessment methods described in the literature for extending credit to large corporations. The main component is the Basel II Accord, which sets regulatory requirements for banks' credit risk assessment methods. The empirical part comprises, as a primary source, an analysis of major Nordic banks' annual reports and risk management reports. As a secondary source, complementary interviews were carried out with senior credit risk assessment personnel. The findings indicate that all major Nordic banks use a combination of quantitative and qualitative information in their credit risk assessment models when extending credit to large corporations. The relative weight of qualitative information depends on the selected approach to credit rating, i.e. point-in-time or through-the-cycle.

Resumo:

One way to verify the adequacy of methods for estimating reference evapotranspiration (ET0) is to compare them with the Penman-Monteith method, recommended by the Food and Agriculture Organization of the United Nations (FAO) as the standard method for estimating ET0. This study compared the Makkink (MK), Hargreaves (HG) and Solar Radiation (RS) methods for estimating ET0 with Penman-Monteith (PM). For this purpose, we used daily data on global solar radiation, air temperature, relative humidity and wind speed for the year 2010, obtained from the automatic meteorological station of the National Institute of Meteorology (latitude 18° 91' 66" S, longitude 48° 25' 05" W, altitude 869 m) on the campus of the Federal University of Uberlandia - MG, Brazil. Results for the period were analysed on a daily basis using regression analysis with the linear model y = ax, where the dependent variable was the Penman-Monteith estimate and the independent variable was the ET0 estimate of each evaluated method. The influence of the standard deviation of daily ET0 on the comparison of methods was also checked. The evaluation indicated that the Solar Radiation and Penman-Monteith methods cannot be compared, while the Hargreaves method showed the most efficient adjustment for estimating ET0.
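The through-origin regression y = ax described above can be sketched in a few lines. The daily ET0 values below are hypothetical (mm/day) and numpy is assumed; this illustrates the fitting procedure only, not the study's actual data.

```python
import numpy as np

# Hypothetical daily ET0 estimates (mm/day); values are illustrative only.
et0_pm = np.array([4.1, 3.8, 5.2, 4.7, 3.5, 4.9])   # Penman-Monteith (standard)
et0_hg = np.array([4.3, 3.6, 5.0, 4.9, 3.7, 4.8])   # evaluated method (e.g. Hargreaves)

# Least-squares fit of the zero-intercept model y = a*x,
# with y = Penman-Monteith and x = the evaluated method.
a = np.sum(et0_hg * et0_pm) / np.sum(et0_hg ** 2)

# Coefficient of determination of the through-origin fit.
residuals = et0_pm - a * et0_hg
r2 = 1 - np.sum(residuals ** 2) / np.sum((et0_pm - et0_pm.mean()) ** 2)
print(f"a = {a:.3f}, R^2 = {r2:.3f}")
```

A slope a close to 1 with high R² indicates good agreement with the standard method.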

Resumo:

The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t test. The average protein concentration and standard deviation of tears collected with microcapillary tubes were 4.45mg/mL ±0.35 and 4.52mg/mL ±0.29 for right and left eyes, respectively. The average protein concentration and standard deviation of tears collected with STT strips were 54.5mg/mL ±0.63 and 54.15mg/mL ±0.65 for right and left eyes, respectively. A statistically significant difference (p<0.001) was found between the methods. Under the conditions of this study, the average protein concentration obtained with the Bradford test from tear samples collected with STT strips was higher than that obtained with microcapillary tubes. Reference values for tear protein concentration should therefore be interpreted according to the method used to collect the samples.
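The two-sample comparison by Student's t test can be sketched as follows. Since the raw data are not available, the samples are simulated with the means and standard deviations reported above; scipy is assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated protein concentrations (mg/mL); the means/SDs mimic the reported
# magnitudes, the individual values are not the study's data.
microcap = rng.normal(4.45, 0.35, size=29)   # microcapillary tube samples
stt      = rng.normal(54.50, 0.63, size=29)  # Schirmer tear test strip samples

t_stat, p_value = stats.ttest_ind(stt, microcap)
print(f"t = {t_stat:.1f}, p = {p_value:.3g}")
```

With group means this far apart relative to their spread, the test rejects equality at any conventional significance level, matching the p<0.001 reported above.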

Resumo:

Protein engineering aims to improve the properties of enzymes and affinity reagents by genetic changes. Typical engineered properties are affinity, specificity, stability, expression, and solubility. Because proteins are complex biomolecules, the effects of specific genetic changes are seldom predictable. Consequently, a popular strategy in protein engineering is to create a library of genetic variants of the target molecule and subject the population to a selection process that sorts the variants by the desired property. This technique, called directed evolution, is a central tool for tailoring protein-based products used in a wide range of applications, from laundry detergents to anti-cancer drugs. New methods are continuously needed to generate larger gene repertoires and compatible selection platforms in order to shorten the development timeline for new biochemicals. In the first study of this thesis, primer extension mutagenesis was revisited to establish higher-quality gene variant libraries in Escherichia coli cells. In the second study, recombination was explored as a method to expand the number of screenable enzyme variants. A selection platform was developed to improve antigen binding fragment (Fab) display on filamentous phages in the third article and, in the fourth study, novel design concepts were tested with two differentially randomized recombinant antibody libraries. Finally, in the last study, the performance of the same antibody repertoire was compared in phage display selections as a genetic fusion to different phage capsid proteins and in different antibody formats, Fab vs. single-chain variable fragment (scFv), in order to find the most suitable display platform for the library at hand. As a result of these studies, a novel gene library construction method, termed selective rolling circle amplification (sRCA), was developed.
The method increases the mutagenesis frequency to close to 100% in the final library and the number of transformants over 100-fold compared with traditional primer extension mutagenesis. In the second study, Cre/loxP recombination was found to be an appropriate tool for resolving the DNA concatemer resulting from error-prone RCA (epRCA) mutagenesis into monomeric circular DNA units for higher-efficiency transformation into E. coli. Library selections against antigens of various sizes in the fourth study demonstrated that diversity placed closer to the antigen binding site of antibodies supports the generation of antibodies against haptens and peptides, whereas diversity at more peripheral locations is better suited for targeting proteins. A comparison of the display formats showed that the truncated capsid protein three (p3Δ) of filamentous phage was superior to the full-length p3 and protein nine (p9) in obtaining a high number of uniquely specific clones. Especially for digoxigenin, a difficult hapten target, the antibody repertoire as scFv-p3Δ provided the clones with the highest binding affinity. This thesis on the construction, design, and selection of gene variant libraries contributes to the practical know-how of directed evolution and contains useful information to support scientists in the field in their undertakings.

Resumo:

A high-frequency cycloconverter acts as a direct ac-to-ac power converter circuit that does not require a diode bridge rectifier. The bridgeless topology removes the forward voltage drop losses present in a diode bridge. In addition, the on-state losses can be reduced to 1.5 times the on-state resistance of the switches in half-bridge operation of the cycloconverter. A high-frequency cycloconverter is reviewed and the charging effect of the dc capacitors in "back-to-back" or synchronous mode operation is analyzed. In addition, a control method is introduced for regulating the dc voltage of the ac-side capacitors in synchronous operation mode. The controller regulates the dc capacitors and prevents the switches from reaching overvoltage. This is accomplished by varying the phase shift between the upper and the lower gate signals. By adding a phase shift between the gate signal pairs, the charge stored in the energy storage capacitors can be discharged through the resonant load and, consequently, the output resonant current amplitude can be increased. These goals are analyzed and illustrated with simulations. The theory is supported with practical measurements in which the proposed control method is implemented in an FPGA device and tested with a high-frequency cycloconverter using super-junction power MOSFETs as switching devices.
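The phase-shift idea can be illustrated with a minimal numeric sketch: two square-wave gate signals whose conduction overlap is set by the shift between them. The switching frequency, duty cycle and signal model are assumptions chosen for illustration, not the paper's FPGA implementation.

```python
import numpy as np

f_sw = 100e3                                 # assumed switching frequency, Hz
t = np.linspace(0, 4 / f_sw, 4000, endpoint=False)  # four switching periods

def gate(t, f, phase_shift=0.0, duty=0.5):
    """Ideal square-wave gate signal; phase_shift is a fraction of a period."""
    return (((t * f + phase_shift) % 1.0) < duty).astype(float)

upper = gate(t, f_sw)                        # upper gate signal pair
lower = gate(t, f_sw, phase_shift=0.25)      # lower pair, shifted by 90 degrees

# Fraction of time both pairs conduct simultaneously; varying the shift
# changes this overlap and, with it, how much of the charge stored in the
# dc capacitors is discharged through the resonant load.
overlap = np.mean(upper * lower)
print(f"conduction overlap = {overlap:.2f}")
```

Sweeping `phase_shift` from 0 to 0.5 moves the overlap from 0.5 down to 0, which is the control degree of freedom the abstract describes.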

Resumo:

Results of subgroup analyses (SA) reported in randomized clinical trials (RCT) cannot be adequately interpreted without information about the methods used in the study design and the data analysis. Our aim was to show how often inaccurate or incomplete reports occur. First, we selected eight methodological aspects of SA on the basis of their importance to a reader in determining the confidence that should be placed in the author's conclusions regarding such analyses. Then, we reviewed the current practice of reporting these methodological aspects of SA in clinical trials in four leading journals: the New England Journal of Medicine (NEJM), the Journal of the American Medical Association (JAMA), the Lancet, and the American Journal of Public Health (AJPH). Eight consecutive reports from each journal published after July 1, 1998 were included. Of the 32 trials surveyed, 17 (53%) had at least one SA. Overall, the proportion of RCT reporting a particular methodological aspect ranged from 23 to 94%. Information on whether the SA was planned before or after the data analysis was reported in only 7 (41%) of the studies. Of the total possible number of items to be reported, NEJM, JAMA, the Lancet and AJPH clearly mentioned 59, 67, 58 and 72%, respectively. We conclude that current reporting of SA in RCT is incomplete and inaccurate. The results of such SA may have harmful effects on treatment recommendations if accepted without judicious scrutiny. We recommend that editors improve the reporting of SA in RCT by giving authors a list of the important items to be reported.
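The headline proportions above follow directly from the reported counts; a quick arithmetic check (Python assumed):

```python
# Counts reported in the abstract.
trials_surveyed = 32
trials_with_sa = 17       # trials containing at least one subgroup analysis
sa_timing_reported = 7    # trials stating whether the SA was pre-planned

sa_share = trials_with_sa / trials_surveyed
timing_share = sa_timing_reported / trials_with_sa
print(f"{sa_share:.0%} of trials had at least one SA")        # 53%
print(f"{timing_share:.0%} reported the timing of the SA")    # 41%
```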

Resumo:

The objective of this study was to evaluate the effect of the carcass suspension method on sheep meat quality. Ten cull ewes with approximately 62 kg of body weight were used. After slaughtering, flaying, evisceration and removal of the head and paws, carcasses were divided longitudinally into two halves. Alternating sides of the half carcasses were hung by the tendon of the gastrocnemius (Treatment 1 - T1) or by the pelvic bone (Treatment 2 - T2) in cold storage for a 24-hour period. Subsequently, the Semimembranosus muscle was removed from all half carcasses for meat quality analyses. The Semimembranosus muscles from carcasses hung by the pelvic bone were more tender than those from carcasses hung by the tendon of the gastrocnemius, with values of 1.99 kgf.cm-2 and 3.15 kgf.cm-2, respectively. Treatment 2 also presented lower cooking losses than Treatment 1, with average values of 32.14 and 33.44%, respectively. The remaining meat quality parameters evaluated were not influenced by the carcass suspension method. We conclude that the carcass suspension method influenced meat tenderness and cooking losses, with better results for carcasses hung by the pelvic bone.

Resumo:

Seed dormancy is a frequent phenomenon in tropical species, causing slow and non-uniform germination. Treatments such as scarification on an abrasive surface and hot water are efficient ways to overcome it. The objective of this study was to quantify seed germination with no treatment (Experiment 1) and to identify an efficient method of breaking dormancy in Schizolobium amazonicum Huber ex Ducke seeds (Experiment 2). The effects of scarification on an electric emery wheel, water at 80ºC and 100ºC, and manual scarification on wood sandpaper were studied. Seeds were sown in a sand and sawdust mixture either immediately after scarification or after immersion in water for 24 h. Germination percentage, hard seed percentage and germination speed were recorded and analyzed in a completely randomized design. Germination was analyzed at six, nine, 12, 15, 18, 21 and 24 days after sowing as a 4x2 factorial design and through regression analysis. Treatment means of the remaining variables were compared by the Tukey test. Seed germination with no treatment started on the 7th day after sowing and reached 90% on the 2310th day (Experiment 1). A significant interaction between dormancy-breaking treatments and time of immersion in water was observed (Experiment 2). In general, immersion in water increased germination in most evaluations. The regression analyses were significant for all treatments except the control and immersion in water at 80ºC. Germination speed was higher when seeds were scarified on an abrasive surface (emery or sandpaper); in these treatments germination ranged from 87% to 96%, with no hard seeds. S. amazonicum seed coats are impermeable to water, which hinders quick and uniform germination. Scarification on an electric emery wheel or on sandpaper, followed by immediate sowing or by sowing after 24 h, were the most efficient treatments for overcoming dormancy in S. amazonicum seeds.
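The Tukey comparison of treatment means can be sketched as follows. The replicate germination percentages are hypothetical (only the 87-96% range for scarified seeds comes from the abstract), and SciPy >= 1.8 is assumed for `scipy.stats.tukey_hsd`.

```python
from scipy import stats

# Hypothetical germination percentages per replicate for three treatments;
# the values are illustrative, not the study's data.
emery     = [92, 95, 96, 90]   # scarification on electric emery
sandpaper = [87, 91, 89, 93]   # scarification on wood sandpaper
control   = [12, 18, 15, 10]   # untreated seeds

res = stats.tukey_hsd(emery, sandpaper, control)
# Pairwise p-values below 0.05 indicate treatment means that differ.
print(res.pvalue.round(4))
```

With these illustrative numbers, the two scarification treatments do not differ from each other but both differ sharply from the control, mirroring the pattern reported above.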

Resumo:

The purpose of this study was to determine novice teachers' perceptions of the extent to which the Brock University teacher education program focused on strategies for promoting responsibility in students. Individual interviews were conducted with ten randomly selected teachers who were graduates of this teacher education program between the years of 1989 and 1992, and a follow-up group discussion activity, with the same teachers, was also held. Findings revealed that the topic of personal responsibility was discussed within various components of the program, including counselling group sessions, but that these discussions were often brief, indirect and inconsistent. Some of the strategies which the teachers used in their own classrooms to promote responsibility in students were ones which they had acquired from those counselling group sessions or from associate teachers. Various strategies included: setting clear expectations of students with positive and negative consequences for behaviour (e.g., material rewards and detentions, respectively), communicating with other teachers and parents, and suspending students from school. A teacher's choice of any particular strategy seemed to be affected by his or her personality, teaching subject and region of employment, as well as certain aspects of the teacher education program. It was concluded that many of the teachers appeared to be controlling rude and violent behaviour, as opposed to promoting responsible behaviour. Recommendations were made for the pre-service program, as well as induction and inservice programs, to increase teacher preparedness for promoting responsible student behaviour. One of these recommendations addressed the need to help teachers learn how to effectively communicate with their students.

Resumo:

The emergence of methylphenidate (MPH; Ritalin) use by university students to improve their concentration and academic performance has attracted public interest and raised important ethical debates among specialists. The different perspectives on cognitive performance enhancement represent an important dimension of the social and ethical challenges surrounding this phenomenon and deserve to be elucidated. This thesis examines the discourses present in international popular press coverage and in the bioethics and public health literatures on the non-medical use of methylphenidate. The research identified and analysed gaps in the ethical, social and scientific perspectives on the non-medical use of methylphenidate to enhance the cognitive performance of healthy individuals. A systematic content analysis of these discourses identified divergent paradigms used to describe the non-medical use of methylphenidate and to discuss its ethical consequences. The "lifestyle choice", "drug abuse" and "cognitive enhancement" paradigms appear in the popular press, bioethics and public health discourses, respectively. The main differences between these paradigms include: how the non-medical use of neuropharmacological agents for performance enhancement is described, the risks and benefits associated with it, the discussion of ethical and social issues, and the prevention strategies and challenges associated with the growing prevalence of the phenomenon.
The divergence of these paradigms reflects the plurality of perceptions of the non-medical use of neuropharmacological agents. Our results suggest the need for debate on neuropharmacological enhancement in order to further identify the issues at stake and to develop coherent public health approaches.

Resumo:

The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. Analysing this huge amount of data requires automated methods as well as human experts. This thesis is devoted to the analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary field of astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic) and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.

One way to identify the type of a variable star and to classify it is for an expert to inspect the phased light curve visually. For several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, as well as some derived parameters. Of these, the period is the most important, since a wrong period leads to sparse light curves and misleading information. Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of the time-varying phenomenon, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. For ground-based observations this is due to daylight and varying weather conditions, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation.

The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection has several causes, such as power leakage to other frequencies due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, caused by regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data remains a difficult problem when huge databases are subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".

It will benefit the variable star community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories behind four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
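As a concrete illustration of phase folding and one non-parametric period search, here is a minimal numpy sketch of the Phase Dispersion Minimization statistic (Stellingwerf 1978) applied to a simulated, unevenly sampled light curve. The data, bin count and period grid are illustrative choices, not the thesis implementation.

```python
import numpy as np

def phase_fold(t, period):
    """Fold observation times on a trial period into phases in [0, 1)."""
    return (t / period) % 1.0

def pdm_statistic(t, mag, period, n_bins=10):
    """PDM statistic: pooled within-bin variance of the phased light curve
    divided by the overall variance. The true period minimizes it."""
    phase = phase_fold(t, period)
    bins = np.floor(phase * n_bins).astype(int)
    num, den = 0.0, 0
    for b in range(n_bins):
        m = mag[bins == b]
        if len(m) > 1:
            num += (len(m) - 1) * np.var(m, ddof=1)
            den += len(m) - 1
    return (num / den) / np.var(mag, ddof=1)

# Simulated unevenly sampled sinusoidal light curve, true period 2.5 d.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 300))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5) + rng.normal(0, 0.02, t.size)

trial_periods = np.linspace(2.0, 3.0, 2001)
theta = [pdm_statistic(t, mag, p) for p in trial_periods]
best = trial_periods[np.argmin(theta)]
print(f"best trial period = {best:.3f} d")
```

Because PDM assumes no model for the light-curve shape, it handles the non-sinusoidal phased light curves common among pulsating and eclipsing variables, at the cost of scanning a dense grid of trial periods.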

Resumo:

Ultrafast laser pulses have become an integral part of the toolbox of countless laboratories doing physics, chemistry, and biology research. The work presented here is motivated by one strand of the ever-growing, interdisciplinary research towards understanding the fundamental workings of light-matter interactions. Specifically, attosecond pulses can be useful tools to obtain the desired insight. However, access to, and the utility of, such pulses depends on the generation of intense, few-cycle, carrier-envelope-phase-stabilized laser pulses. The presented work can be thought of as a roadmap towards the latter. From the oscillator that provides the broadband seed to the amplification methods, the pieces necessary for the generation of attosecond pulses are discussed. A range of topics, from fundamentals to design challenges, is presented, paving the way towards the practical implementation of an intense, few-cycle, carrier-envelope-phase-stabilized laser source.

Resumo:

Lecture notes for a course on methods of mathematical physics.

Resumo:

This paper discusses a study to compare two tests of loss of capacity to hear speech.