961 results for Süleyman I, Sultan of the Turks, 1494 or 5-1566.


Relevance: 100.00%

Abstract:

During the last decade the use of randomised gene libraries has had an enormous impact in the field of protein engineering. Such libraries comprise many variations of a single gene in which codon replacements are used to substitute key residues of the encoded protein. The expression of such libraries generates a library of randomised proteins which can subsequently be screened for desired or novel activities. Randomisation in this fashion has predominantly been achieved by the inclusion of the codons NNN or NN(G, C or T), in which N represents any of the four bases A, C, G or T. The use of these codons, however, necessitates the cloning of redundant codons at each position of randomisation, in addition to those required to encode the twenty possible amino acid substitutions. As degenerate codons must be included at each position of randomisation, this results in a progressive loss of randomisation efficiency as the number of randomised positions is increased. The ratio of genes to proteins in these libraries rises exponentially with each position of randomisation, creating large gene libraries which generate protein libraries of limited diversity upon expression. In addition to these problems of library size, the cloning of redundant codons also results in protein libraries in which substituted amino acids are unevenly represented. As several of the randomised codons may encode the same amino acid (serine, for example, is encoded six times using the codon NNN), an inherent bias may be introduced into the resulting protein library during the randomisation procedure. The work outlined here describes the development of a novel randomisation technique aimed at eliminating codon redundancy from randomised gene libraries, thus addressing the problems of library size and bias associated with the cloning of redundant codons.
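As a rough illustration of the size and bias problem described above, the following Python sketch (a toy calculation, not part of the original work) uses the standard genetic code to show the six-fold over-representation of serine under NNN randomisation and the exponential growth of the genes-to-proteins ratio:

    # Toy illustration of codon redundancy and library-size growth for
    # NNN randomisation (standard genetic code: 61 sense + 3 stop codons).
    codons_per_aa = {
        "Ala": 4, "Arg": 6, "Asn": 2, "Asp": 2, "Cys": 2, "Gln": 2,
        "Glu": 2, "Gly": 4, "His": 2, "Ile": 3, "Leu": 6, "Lys": 2,
        "Met": 1, "Phe": 2, "Pro": 4, "Ser": 6, "Thr": 4, "Trp": 1,
        "Tyr": 2, "Val": 4,
    }
    stop_codons = 3
    assert sum(codons_per_aa.values()) + stop_codons == 64

    # Bias at a single randomised position: serine (6 codons) is six
    # times more likely than methionine or tryptophan (1 codon each).
    print("P(Ser) =", codons_per_aa["Ser"] / 64, " P(Met) =", codons_per_aa["Met"] / 64)

    # The genes-to-proteins ratio rises exponentially with the number of
    # randomised positions n: 64**n gene sequences encode at most 20**n proteins.
    for n in (1, 3, 5):
        print(f"n={n}: {64**n} gene variants for at most {20**n} protein variants "
              f"(ratio {64**n / 20**n:.1f})")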

Relevance: 100.00%

Abstract:

This thesis was concerned with investigating methods of improving the IOP pulse's potential as a measure of clinical utility. There were three principal sections to the work.

1. Optimisation of measurement and analysis of the IOP pulse. A literature review, covering the years 1960–2002 and other relevant scientific publications, provided a knowledge base on the IOP pulse. Initial studies investigated suitable instrumentation and measurement techniques. Fourier transformation was identified as a promising method of analysing the IOP pulse and this technique was developed.

2. Investigation of ocular and systemic variables that affect IOP pulse measurements. In order to recognise clinically important changes in IOP pulse measurement, studies were performed to identify influencing factors. Fourier analysis was tested against traditional parameters in order to assess its ability to detect differences in the IOP pulse. In addition, it had been speculated that the waveform components of the IOP pulse contained vascular characteristics analogous to those found in arterial pulse waves. Validation studies to test this hypothesis were attempted.

3. The nature of the intraocular pressure pulse in health and disease and its relation to systemic cardiovascular variables. Fourier analysis and traditional parameters were applied to IOP pulse measurements taken on diseased and healthy eyes. Only the derived parameter, pulsatile ocular blood flow (POBF), detected differences in the diseased groups. The use of an ocular pressure-volume relationship may have improved the POBF measure's variance in comparison to the measurement of the pulse's amplitude or Fourier components.

Finally, the importance of the driving force of pulsatile blood flow, the arterial pressure pulse, is highlighted. A method of combining the measurements of pulsatile blood flow and pulsatile blood pressure to create a measure of ocular vascular impedance is described, along with its advantages for future studies.
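The thesis's own implementation is not given in the abstract; a minimal sketch of the kind of Fourier decomposition described, using a synthetic IOP pulse and an assumed 100 Hz sampling rate, might look like this:

    # Minimal sketch of Fourier analysis of an IOP pulse (synthetic data;
    # the sampling rate and waveform are illustrative assumptions, not
    # the thesis's measurement protocol).
    import numpy as np

    fs = 100.0                      # sampling frequency, Hz (assumed)
    t = np.arange(0, 10, 1 / fs)    # 10 s recording
    heart_rate = 1.2                # Hz (about 72 beats per minute)

    # Synthetic IOP trace: mean pressure plus a fundamental and two harmonics.
    iop = (15.0
           + 1.0 * np.sin(2 * np.pi * heart_rate * t)
           + 0.3 * np.sin(2 * np.pi * 2 * heart_rate * t)
           + 0.1 * np.sin(2 * np.pi * 3 * heart_rate * t))

    spectrum = np.fft.rfft(iop - iop.mean())
    freqs = np.fft.rfftfreq(iop.size, d=1 / fs)
    amplitudes = 2 * np.abs(spectrum) / iop.size

    # Report the amplitude of the fundamental and the first two harmonics.
    for k in (1, 2, 3):
        idx = np.argmin(np.abs(freqs - k * heart_rate))
        print(f"harmonic {k}: {freqs[idx]:.2f} Hz, amplitude {amplitudes[idx]:.2f} mmHg")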

Relevance: 100.00%

Abstract:

Developing a means of predicting tool life has been, and continues to be, a focus of much research effort. A common experience in attempting to replicate such efforts is an inability to achieve the levels of agreement between theory and practice reported by the original researcher, or to extrapolate the work to materials or cutting conditions different from those originally used. This thesis sets out to examine why most equations or models, when replicated, do not give good agreement. One reason identified is that predictions in the wear literature are limited because researchers generally fail to properly identify the nature of the wear mechanisms operative in their studies; they also fail to identify or recognise factors having a significant influence on wear, such as bar diameter. This research also examines the similarities and differences between the two processes of single-point turning and drilling through a series of tests. A literature survey was undertaken on wear and wear prediction. It found a paucity of information and research on drilling compared with the turning operation, extending to the lack of standards that exist for the drilling operation. One reason for this scarcity of information on drilling is the complexity of the drilling process and of the tool geometry of the drill. In the comparative drilling and turning tests performed in this work, the same tool material (HSS) and similar work materials were used in order to eliminate differences which might otherwise arise from this factor. Results of the tests were evaluated and compared for the two operations, and SEM photographs were taken of the chips produced. Specific test results were obtained for the cutting temperatures and the forces on the tool. It was found that cutting temperature is influenced by various factors such as tool geometry and cutting speed, and that the temperature itself influences tool wear and the wear mechanisms acting on the tool. It was also shown that bar diameter influences the temperature, a factor not considered previously.
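The abstract does not name the tool-life model examined; the classic Taylor equation, V * T^n = C, is the usual starting point for such predictions, and the small sketch below (with illustrative constants only, not values from the thesis) shows how sensitive the predicted life is to the exponent n, which is one reason replicated equations can extrapolate poorly:

    # Taylor's tool-life equation: V * T**n = C, so T = (C / V)**(1 / n).
    # The constants below are illustrative assumptions for an HSS tool,
    # not values taken from the thesis.
    def tool_life_minutes(v_cut: float, n: float, c: float) -> float:
        """Predicted tool life T (min) at cutting speed v_cut (m/min)."""
        return (c / v_cut) ** (1.0 / n)

    C = 70.0   # assumed Taylor constant (cutting speed giving T = 1 min)
    for n in (0.10, 0.125, 0.15):   # small changes in the exponent...
        # ...produce order-of-magnitude changes in predicted life.
        print(f"n={n}: T = {tool_life_minutes(30.0, n, C):.1f} min at 30 m/min")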

Relevance: 100.00%

Abstract:

Differences among accounting regulations can lead to financial statements for the same entity and the same period that differ, sometimes materially, even though each set is correct under its own regulatory logic. These often material differences are hard to explain, given that the same underlying economic events lie behind the different financial statements; they are explained by differences in the regulations themselves. The paper identifies the levels of accounting regulation that can be distinguished and examines, based on the relevant literature, the factors that led the various approaches to offer markedly different solutions for the recognition, measurement and disclosure of the same items or transactions. By systematising the factors that induce these differences, the article attempts, using a deductive approach, to outline possible future directions for the harmonisation of accounting systems.

Relevance: 100.00%

Abstract:

This study investigated the influence that receiving instruction in two languages, English and Spanish, had on the performance of students enrolled in the International Studies (IS) Program (a delayed partial immersion model) of Miami Dade County Public Schools on a standardized test in English, the Stanford Achievement Test, eighth edition, for three of its sections: Reading Comprehension, Mathematics Computation, and Mathematics Applications.

The performance of the selected IS program/Spanish section cohort of students (N = 55) on the SAT Reading Comprehension, Mathematics Computation, and Mathematics Applications sections over four consecutive years was contrasted with that of a control group of comparable students selected within the same feeder pattern where the IS program is implemented (N = 21). The performance of the group was also compared to the cross-sectional achievement patterns of the school's corresponding feeder pattern, region, and district.

The research model for the study was a variation of the "causal-comparative" or "ex post facto" design, sometimes referred to as "prospective". After data were collected from MDCPS, t-tests were performed to compare the IS-Spanish students' SAT performance for grades 3 to 6 for the years 1994 to 1997 against control group, feeder pattern, region, and district norms for each year for Reading Comprehension, Mathematics Computation, and Mathematics Applications. Repeated measures ANOVA and Tukey's tests were calculated to compare the mean percentiles of the groups under study and the possible interactions of the different variables. All tests were performed at the 5% significance level.

The analyses showed that the IS group performed significantly better than the control group on all three measures across the four years. The IS group's mean percentiles on the three measures were also significantly higher than those of the feeder pattern, region, and district. The null hypotheses were rejected, and it was concluded that receiving instruction in two languages did not negatively affect the performance of IS program students on tests taken in English. It was also concluded that the particular design of the IS program enhances the general performance of participating students on standardized tests.

The quantitative analyses were coupled with interviews with teachers and administrators of the IS program to gain additional insight into different aspects of the implementation of the program at each particular school.
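As a minimal sketch of the kind of group comparison described (an independent-samples t-test at the 5% level), with made-up percentile scores standing in for the SAT data:

    # Sketch of the comparison described above: an independent-samples
    # t-test at the 5% significance level. The percentile scores are
    # made up; the original study used SAT percentile norms.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    is_group = rng.normal(70, 10, size=55)   # IS-program cohort (N = 55)
    control = rng.normal(60, 10, size=21)    # comparison group (N = 21)

    t_stat, p_value = stats.ttest_ind(is_group, control)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Reject the null hypothesis of equal means at the 5% level.")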

Relevance: 100.00%

Abstract:

Lineup procedures have recently garnered extensive empirical attention, in an effort to reduce the number of mistaken identifications that plague the criminal justice system. Relatively little attention, however, has been paid to the influence of the lineup constructor or the lineup construction technique on the quality of the lineup. This study examined, in a series of three phases, whether the cross-race effect influences the quality of lineups constructed using a match-to-suspect or match-to-description technique. Participants generated descriptions of same- and other-race targets in Phase 1, which were used in Phase 2. In Phase 2, participants were asked to create lineups for own-race and other-race targets using one of the two techniques. The lineups created in this phase were examined for quality in Phase 3 by calculating lineup fairness assessments through the use of a mock-witness paradigm.

Overall, the results of these phases suggest that the race of those involved in the lineup construction process influences lineups. There was no difference in witness description accuracy in Phase 1, which ran counter to predictions based on the cross-race effect. The cross-race effect was observed, however, in Phases 2 and 3. The lineup construction technique used also influenced several of the process measures, selection estimates, and fairness judgments in Phase 2. Interestingly, the cross-race effect ran in the opposite direction to that predicted for some measures in both phases. In Phase 2, the cross-race effect was as predicted for the number of foils viewed, but in the opposite direction for the average time spent viewing each foil. In Phase 3, the cross-race effect was in the opposite direction to that predicted, with higher levels of lineup fairness in other-race lineups. The practical implications of these findings are discussed in relation to lineup fairness within the legal system.
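The abstract does not specify which fairness measures were calculated; a common one in the mock-witness paradigm is functional size (the number of mock witnesses divided by the number who chose the suspect), sketched below with hypothetical counts:

    # Sketch of a lineup-fairness measure from a mock-witness paradigm.
    # The counts are hypothetical; the study's actual measures are not
    # specified in the abstract.
    def functional_size(n_mock_witnesses: int, n_chose_suspect: int) -> float:
        """Mock witnesses divided by those selecting the suspect; a fair
        k-person lineup should score close to k."""
        return n_mock_witnesses / n_chose_suspect

    lineup_size = 6
    n_mock = 60
    n_suspect = 25   # hypothetical: suspect chosen far above chance

    print("chance expectation:", n_mock / lineup_size, "suspect picks")
    print("observed suspect picks:", n_suspect)
    print("functional size:", functional_size(n_mock, n_suspect))  # 2.4 << 6: biased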

Relevance: 100.00%

Abstract:

The present study measures the increase in serum carotenoid concentration in 30 healthy individuals after supplementation with a low-dose xanthophyll ester (3 and 6 mg of lutein equivalent per day) compared with a placebo. Serum levels of carotenoids were measured using HPLC and showed an increase in the concentration of lutein, zeaxanthin and four lutein metabolites proportional to dose. In order to further assess the importance of the end-group structure in carotenoids, we investigated the influence of end-group type and functionality on the conformational energy barrier. We used the density functional method implemented in GAUSSIAN 98 to calculate the conformational energy curves for rotation of the β-ring or the ε-ring relative to short polyene chains around the C6-C7 single bond. A large barrier is observed for the interconversion of conformers in the ε-rings (8 kcal/mol) when compared to β-rings (2.3-3 kcal/mol).
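To put the calculated barriers in perspective, the short sketch below (a transition-state-theory estimate, not part of the original paper) converts them into approximate room-temperature interconversion rates via the Eyring equation:

    # Rough transition-state-theory estimate (not from the original paper):
    # treat each computed barrier as an activation free energy and convert
    # it to an interconversion rate at 298 K via the Eyring equation,
    # k = (kB*T/h) * exp(-Ea / (R*T)).
    import math

    KB = 1.380649e-23    # Boltzmann constant, J/K
    H = 6.62607015e-34   # Planck constant, J*s
    R = 1.987204e-3      # gas constant, kcal/(mol*K)
    T = 298.15           # temperature, K

    def eyring_rate(barrier_kcal_per_mol: float) -> float:
        return (KB * T / H) * math.exp(-barrier_kcal_per_mol / (R * T))

    for label, ea in [("beta-ring, low end", 2.3), ("beta-ring, high end", 3.0),
                      ("epsilon-ring", 8.0)]:
        print(f"{label}: Ea = {ea} kcal/mol -> k ~ {eyring_rate(ea):.2e} s^-1")

Even as a rough estimate, the 8 kcal/mol ε-ring barrier slows interconversion by four to five orders of magnitude relative to the β-ring, consistent with the distinction the abstract draws between the two end-groups.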

Relevance: 100.00%

Abstract:

Human cadavers have long been used to teach human anatomy and are increasingly used in other disciplines. Different embalming techniques have been reported in the literature; however, there is no clear consensus among anatomists on the utility of embalmed cadavers for the teaching of anatomy. To this end, we surveyed British and Irish anatomy teachers to report their opinions on different preservation methods for the teaching of anatomy. In this project eight human cadavers were embalmed using formalin, Genelyn, Thiel and Imperial College London Soft Preserving (ICL-SP) techniques to compare different characteristics of these four techniques. The results of this thesis show that anatomy teachers consider hard-fixed cadavers not to be the most accurate teaching model of the human body, although they still serve as a useful teaching method (Chapter 2). In addition, our findings confirm that the joints of cadavers embalmed using the ICL-SP solution faithfully mimic the joints of an unembalmed cadaver compared to the other techniques (Chapter 3). Embalming a human body prevents deterioration in the quality of images, and our findings highlight that the influence of the embalming solutions varied with the radiological modality used (Chapter 4). The method developed as part of this thesis enables anatomists and forensic scientists to quantify the decomposition rate of an embalmed human cadaver (Chapter 5). Formalin embalming solution showed the strongest antimicrobial abilities, followed by Thiel, Genelyn and finally ICL-SP (Chapter 6). The overarching conclusion of this set of studies is that it is inaccurate to state that one embalming technique is ultimately the best: the value of each technique differs according to the requirements of the particular education or research area. Hence we highlight how different embalming techniques may be better suited to certain fields of study.

Relevance: 100.00%

Abstract:

Loraine Leeson shared a panel with Hilary Wainwright and Linda Bellos OBE to speak about the cultural legacy of the Greater London Council. As a former member of the GLC's Community Arts sub-committee in the 1980s, she drew on this experience to highlight the usefully different model offered by its arts policies in contrast to the top-down, target-driven arts funding structures that are so familiar today. GLC policies led to a different kind of art, and with new life now being breathed into the Labour movement, younger generations are looking to lessons from the past to learn how things can be done differently.

Relevance: 100.00%

Abstract:

Fostering the emergence of a "European identity" was one of the declared goals of the euro adoption. Now, years after the physical introduction of the common currency, we assess whether there has been an effect on a shared European identity. We use two different datasets in order to assess the impact of the euro adoption on the fostering of a self-declared "European identity". We find that the effect of the euro is statistically insignificant, although it is precisely estimated: the data support an effect close to zero. This result holds important implications for European policy makers. It also sheds new light on the formation of social identities.

Relevance: 100.00%

Abstract:

When we talk of culture we are talking about a source of enormous historical social power. Why did the GLC's commitment to popular culture upset the rules of the unelected arts and sports quangos? How did the now-abolished GLC get politics into arts policy?

Relevance: 100.00%

Abstract:

Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming, unforeseen multiplier effects may occur.
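The authors' sequential t-test (STARS-style) implementation is not reproduced here; as a toy illustration of how such an analysis can locate a regime shift, the sketch below scans a synthetic annual series for the split point that maximises the two-sample t-statistic:

    # Toy illustration of regime-shift detection (not the authors'
    # implementation): scan a synthetic annual series for the split
    # point that maximises the two-sample t-statistic between segments.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    years = np.arange(1950, 2001)
    series = rng.normal(0.0, 0.5, years.size)
    series[years >= 1987] += 1.0   # impose a step change in the late 1980s

    best = max(
        (abs(stats.ttest_ind(series[:i], series[i:]).statistic), i)
        for i in range(5, years.size - 5)   # at least 5 points per segment
    )
    print(f"strongest shift detected at {years[best[1]]} (|t| = {best[0]:.1f})")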


Relevance: 100.00%

Abstract:

There has been plenty of debate in the academic literature about the nature of the common good or public interest in planning. There is a recognition that the idea is extremely difficult to isolate in practical terms; nevertheless, scholars insist that it ‘…remains the pivot around which debates about the nature of planning and its purposes turn’ (Campbell & Marshall, 2002, 163–64). At the level of first principles, these debates have broached political theories of the state and even philosophies of science that inform critiques of rationality, social justice and power. In the planning arena specifically, much of the scholarship has tended to focus on theorising the move from a rational comprehensive planning system in the 1960s and 1970s to one that is now dominated by deliberative democracy in the form of collaborative planning. In theoretical terms, this debate has been framed as a movement from what are perceived as objective and elitist notions of planning practice and decision-making to ones that are considered (by some) to be ‘inter-subjective’ and non-elitist. Yet despite significant conceptual debate, only a small number of empirical studies have tackled the issue by investigating notions of the common good from the perspective of planning practitioners. What do practitioners understand by the idea of the common good in planning? Do they actively consider it when making planning decisions? Do governance or institutional barriers exist to pursuing the common good in planning? In this paper, these questions are addressed using the case of Ireland. The methodology consists of a series of semi-structured qualitative interviews with 20 urban planners working across four planning authorities within the Greater Dublin Area, Ireland. The findings show that the most frequently cited definition of the common good is balancing different competing interests and avoiding or minimising the negative effects of development. The results show that practitioner views of the common good are far removed from the lofty ideals of planning theory and reflect the ideological shift of planners within an institution that has been heavily neoliberalised since the 1970s.