967 results for Modèle non-standard


Relevance: 80.00%

Publisher:

Abstract:

This layer is a georeferenced raster image of the historic paper map entitled: Pacific Ocean : compiled from Admiralty surveys & other official sources by the India-Rubber, Gutta-Percha & Telegraph Works Co. It was published by J.D. Potter in [1899]. Scale [ca. 1:15,000,000]. The image inside the map neatline is georeferenced to the surface of the earth and fit to a non-standard 'Mercator' projection with the central meridian at 170 degrees west. All map collar and inset information is also available as part of the raster image, including any inset maps, profiles, statistical tables, directories, text, illustrations, index maps, legends, or other information associated with the principal map. Note: The central meridian of this map is not the same as the Prime Meridian and may wrap the International Date Line or overlap itself when displayed in GIS software. This map shows features such as drainage, territorial boundaries, shoreline features, and more. Relief shown by hachures. Depths shown by soundings. Shows routes of Admiralty surveys. This layer is part of a selection of digitally scanned and georeferenced historic maps from the Harvard Map Collection and the Harvard University Library as part of the Open Collections Program at Harvard University project: Organizing Our World: Sponsored Exploration and Scientific Discovery in the Modern Age. Maps selected for the project correspond to various expeditions and represent a range of regions, originators, ground condition dates, scales, and purposes.
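The record's caveat about the 170° W central meridian can be made concrete with a small sketch of the spherical Mercator forward formula. Everything here is illustrative (the sphere radius and function names are assumptions, not taken from the map's metadata); the point is the longitude normalisation that keeps Pacific features from wrapping around the International Date Line:

```python
import math

R = 6370997.0  # sphere radius in metres (assumed; the map's datum is not stated)

def mercator_forward(lon_deg, lat_deg, central_meridian_deg=-170.0):
    """Spherical Mercator with a shifted central meridian.

    Longitudes are first normalised into the interval
    (central_meridian - 180, central_meridian + 180], so Pacific
    features near the International Date Line do not wrap to the
    far edge of the map.
    """
    dlon = (lon_deg - central_meridian_deg + 180.0) % 360.0 - 180.0
    x = R * math.radians(dlon)
    y = R * math.log(math.tan(math.pi / 4.0 + math.radians(lat_deg) / 2.0))
    return x, y

# 175 deg E is only 15 deg west of the 170 deg W meridian, so it plots
# slightly left of centre instead of at the opposite edge.
x, y = mercator_forward(175.0, 0.0)
```

GIS software that assumes a Greenwich-centred Mercator skips this normalisation, which is why such a georeferenced raster can appear to wrap or overlap itself.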

Relevance: 80.00%

Publisher:

Abstract:

This layer is a georeferenced raster image of the historic paper map entitled: Behring's Sea and Arctic Ocean : from surveys of the U.S. North Pacific Surveying Expedition in 1855, Commander John Rodgers U.S.N. commanding and from Russian and English authorities, J.C.P. de Kraft, commodore U.S.N. Hydrographer to the Bureau of Navigation ; compiled by E.R. Knorr ; drawn by Louis Waldecker. Corr. & additions to Jan. 1882. It was published by U.S. Navy, Hydrographic Office in 1882. Scale [ca. 1:4,400,000]. Covers the Bering Sea and Arctic Ocean region. The image inside the map neatline is georeferenced to the surface of the earth and fit to a non-standard 'Mercator' projection with the central meridian at 180 degrees west. All map collar and inset information is also available as part of the raster image, including any inset maps, profiles, statistical tables, directories, text, illustrations, index maps, legends, or other information associated with the principal map. Note: The central meridian of this map is not the same as the Prime Meridian and may wrap the International Date Line or overlap itself when displayed in GIS software. This map shows features such as drainage, cities and other human settlements, territorial boundaries, expedition routes, shoreline features, bays, harbors, islands, rocks, and more. Relief shown by hachures and spot heights. Depths shown by soundings. Includes drawing of Wrangel Island "as seen from Bark Nile of New London ... ; 15 to 18 miles distant". This layer is part of a selection of digitally scanned and georeferenced historic maps from the Harvard Map Collection and the Harvard University Library as part of the Open Collections Program at Harvard University project: Organizing Our World: Sponsored Exploration and Scientific Discovery in the Modern Age. Maps selected for the project correspond to various expeditions and represent a range of regions, originators, ground condition dates, scales, and purposes.

Relevance: 80.00%

Publisher:

Abstract:

Wood is a natural and traditional building material, as popular today as ever, and offers several advantages. Physically, wood is strong and stiff, yet compared with materials like steel it is light and flexible. Wood absorbs sound very effectively and is a relatively good heat insulator. However, dry wood burns quite easily and produces a great deal of heat energy. Its main disadvantage is combustibility when exposed to fire: it ignites at roughly 200–400°C. After fire exposure, it is necessary to determine whether charred wooden structures are safe for future use. Design methods require computer modelling to predict the fire exposure and the capacity of structures to resist those actions. Large- or small-scale experimental tests are also necessary to calibrate and verify the numerical models. The thermal model is essential for wood structures exposed to fire because it predicts the charring rate as a function of fire exposure. The charring rate of most structural wood elements can be obtained with simple calculations, but the problem is more complicated when the fire exposure is non-standard or the wood elements are protected with other materials. In this work, the authors present different case studies using numerical models that will help professionals analyse wood elements and identify the information needed to decide whether charred structures remain fit for use. Different thermal models representing wooden cellular slabs, used in building construction for ceilings or floors, will be analysed and subjected to different fire scenarios (with standard fire curve exposure). The same numerical models, with insulation material inside the wooden cellular slabs, will be tested to compare the fire resistance time and the calculated charring rate.
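A minimal sketch of the simple charring calculation the abstract contrasts with the harder non-standard cases. The notional charring rate of 0.8 mm/min for solid softwood and the three-sided exposure geometry are assumptions in the spirit of EN 1995-1-2, not values taken from this work:

```python
import math

def iso834_temperature(t_min):
    """Standard fire curve (ISO 834): gas temperature in deg C at t minutes."""
    return 20.0 + 345.0 * math.log10(8.0 * t_min + 1.0)

def char_depth_mm(t_min, beta_n=0.8):
    """Notional char depth: beta_n * t.  beta_n = 0.8 mm/min is the
    EN 1995-1-2 notional rate for solid softwood (an assumed value)."""
    return beta_n * t_min

def residual_section(b_mm, h_mm, t_min, beta_n=0.8):
    """Residual cross-section of a beam charred on three sides
    (two vertical faces plus the soffit)."""
    d = char_depth_mm(t_min, beta_n)
    return b_mm - 2.0 * d, h_mm - d

# A 100 x 200 mm softwood beam after 30 minutes of standard fire.
b_res, h_res = residual_section(100.0, 200.0, 30.0)
```

For non-standard fire exposures or protected elements the charring rate is no longer a constant, which is exactly where the numerical thermal models described here take over.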

Relevance: 80.00%

Publisher:

Abstract:

Wood is a natural and traditional building material, as popular today as ever, and offers several advantages. Physically, wood is strong and stiff, yet compared with materials like steel it is light and flexible. Wood absorbs sound very effectively and is a relatively good heat insulator. However, dry wood burns quite easily and produces a great deal of heat energy. Its main disadvantage is combustibility when exposed to fire: it ignites at roughly 200–400°C. After fire exposure, it is necessary to determine whether charred wooden structures are safe for future use. Design methods require computer modelling to predict the fire exposure and the capacity of structures to resist those actions. Large- or small-scale experimental tests are also necessary to calibrate and verify the numerical models. The thermal model is essential for wood structures exposed to fire because it predicts the charring rate as a function of fire exposure. The charring rate of most structural wood elements can be obtained with simple calculations, but the problem is more complicated when the fire exposure is non-standard or the wood elements are protected with other materials.

Relevance: 80.00%

Publisher:

Abstract:

After advocating flexibilization of non-standard work contracts for many years, some European and international institutions and several policy makers now point to the standard employment relationship and its regulation as a cause of segmentation between the labour market of "guaranteed" insiders, employed under permanent contracts with effective protection against unfair dismissal, and the market of the "not-guaranteed" outsiders, working under non-standard contracts. Reforms of employment legislation are therefore being promoted and approved in different countries, allegedly aiming to balance the legal protection afforded to standard and non-standard workers. This article first argues that this approach is flawed: it oversimplifies the causes of segmentation by concentrating on an "insiders-outsiders" discourse that cannot easily be transplanted to continental Europe. After reviewing current legislative changes in Italy, Spain and Portugal, it then argues that lawmakers have focused on deregulation rather than on balancing protection in approving recent reforms. Finally, the mainstream approach to segmentation and some of its derivative proposals, such as calls to introduce a "single permanent contract", are called into question, as they seem to neglect the essential role of job protection in underpinning the effectiveness of fundamental and constitutional rights at the workplace.

Relevance: 80.00%

Publisher:

Abstract:

This paper explores the effects of non-standard monetary policies on international yield relationships. Based on a descriptive analysis of international long-term yields, we find evidence that long-term rates followed a global downward trend prior to as well as during the financial crisis. Comparing interest rate developments in the US and the eurozone, it is difficult to detect a distinct impact of the first round of the Fed's quantitative easing programme (QE1) on US interest rates for which the global environment (the global downward trend in interest rates) does not account. Motivated by these findings, we analyse the impact of the Fed's QE1 programme on the stability of the US-euro long-term interest rate relationship using a CVAR (cointegrated vector autoregressive) model and, in particular, recursive estimation methods. Using data gathered between 2002 and 2014, we find limited evidence that QE1 caused the break-up of, or destabilised, the transatlantic interest rate relationship. Taking global interest rate developments into account, we thus find no significant evidence that QE had any independent, distinct impact on US interest rates.
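The intuition behind the cointegration analysis can be sketched with a stylised simulation (this is invented data, not the paper's sample or its CVAR model): two series sharing one stochastic trend are individually non-stationary, yet their spread has bounded variance:

```python
import random
import statistics

random.seed(42)  # deterministic stylised example

# Two series sharing one stochastic trend (a random walk), standing in
# loosely for US and euro-area long-term yields.
n = 2000
trend = 0.0
us, euro = [], []
for _ in range(n):
    trend += random.gauss(0.0, 1.0)        # common non-stationary trend
    us.append(trend + random.gauss(0.0, 0.5))
    euro.append(trend + random.gauss(0.0, 0.5))

spread = [a - b for a, b in zip(us, euro)]

# The levels inherit the trend's growing variance; the spread's variance
# stays bounded.  A CVAR makes this informal signature of cointegration
# precise and testable, including recursively over subsamples.
var_level = statistics.pvariance(us)
var_spread = statistics.pvariance(spread)
```

A break-up of the transatlantic relationship would show up as the spread itself becoming non-stationary, which is what the recursive estimates are checking for around QE1.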

Relevance: 80.00%

Publisher:

Abstract:

This presentation gives the inside story of the PhD project El malagueño real, mental y virtual. Configuración de los significados sociales de una variedad urbana in Hispanic Linguistics, which analyses the production and perception of the Spanish spoken in the city of Malaga and used on the social network sites Facebook and Tuenti by users from Malaga. This southern Spanish variety is quite distinct from the national standard in terms of its phonetic features, its prestige, and the attitudes towards it. The project started from the initial interest in «Why do people often communicate in very "strange" ways on social media?», which then shifted slightly to the final research question «What do the different non-standard variants mean in virtual (and real) malagueño?». This long, sometimes hazardous, yet mostly fun process is described in more detail through the research questions, methods and results. Lastly, the presentation concludes with some lessons learnt and an outlook on possibilities and necessities for further investigation.

Relevance: 80.00%

Publisher:

Abstract:

The Spanish spoken in the city of Malaga, like Andalusian Spanish in general, was in the past often considered an incorrect, low-prestige variety of Spanish, strongly associated with the poor, rural, backward south of Spain. This southern Spanish variety is easily recognised by its innovative phonetic features, which diverge from the national standard, even though in recent years a convergence towards the standard has been observed for some features. Despite its low prestige, the local variety is quite often used on social network sites, where it is considered urban, fashionable and cool. This paper therefore analyses whether the Spanish used in the city of Malaga is undergoing an attitude change. The study draws on naturally occurring speech, data extracted from Facebook, and a series of questionnaires about the salience, attitude and perception of the local variety of Spanish. The influence of the social factors age and gender is analysed, since both are known to play a crucial role in many instances of language change. The first is of special interest, as during the Franco dictatorship dialect use was not accepted in schools or in the media. Results show that, on the one hand, people from Malaga hold a more positive attitude towards non-standard features used on social network sites than in spoken language. On the other hand, young female users employ the most non-standard features online and, unsurprisingly, have an extremely positive attitude towards this use. In spoken Spanish, however, the use of, and attitudes towards, some features are led by men and by speakers educated during the Franco dictatorship, while other features, such as elision of intervocalic /d/, elision of final /ɾ/, /l/ and /d/, and ceceo, are predominantly employed by younger speakers and women. These features are considered salient in the local variety and work as local identity markers.
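The two elision features named above can be illustrated with a toy rewrite rule. This is a hypothetical sketch for exposition only; actual social media spellings of malagueño are far more variable than two regular expressions:

```python
import re

def malaguenise(word):
    """Toy rewrite of two salient non-standard features (a hypothetical
    illustration, not a model of real usage)."""
    # Elision of intervocalic /d/ in -ado/-ada endings: cansado -> cansao
    word = re.sub(r'([aeiou])d([ao])$', r'\1\2', word)
    # Elision of word-final /r/, /l/, /d/: comer -> come, verdad -> verda
    word = re.sub(r'[rld]$', '', word)
    return word
```

Counting how often such spellings appear in Facebook data, split by age and gender, is the kind of quantitative step that supports the attitude findings reported here.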

Relevance: 80.00%

Publisher:

Abstract:

The international perspectives on these issues are especially valuable in an increasingly connected, but still institutionally and administratively diverse world. The research addressed in several chapters in this volume includes issues around technical standards bodies like EpiDoc and the TEI, engaging with ways these standards are implemented, documented, taught, used in the process of transcribing and annotating texts, and used to generate publications and as the basis for advanced textual or corpus research. Other chapters focus on various aspects of philological research and content creation, including collaborative or community driven efforts, and the issues surrounding editorial oversight, curation, maintenance and sustainability of these resources. Research into the ancient languages and linguistics, in particular Greek, and the language teaching that is a staple of our discipline, are also discussed in several chapters, in particular for ways in which advanced research methods can lead into language technologies and vice versa and ways in which the skills around teaching can be used for public engagement, and vice versa. A common thread through much of the volume is the importance of open access publication or open source development and distribution of texts, materials, tools and standards, both because of the public good provided by such models (circulating materials often already paid for out of the public purse), and the ability to reach non-standard audiences, those who cannot access rich university libraries or afford expensive print volumes. Linked Open Data is another technology that results in wide and free distribution of structured information both within and outside academic circles, and several chapters present academic work that includes ontologies and RDF, either as a direct research output or as essential part of the communication and knowledge representation. 
Several chapters focus not on the literary and philological side of classics, but on the study of cultural heritage, archaeology, and the material supports on which original textual and artistic material is engraved or otherwise inscribed, addressing the capture and analysis of artefacts in both 2D and 3D, the representation of data through archaeological standards, and the importance of sharing information and expertise between the several domains, both within and outside academia, that study, record and conserve ancient objects. Almost without exception, the authors reflect on the issues of interdisciplinarity and collaboration, the relationship between their research practice and teaching and/or communication with a wider public, and the importance of the role of the academic researcher in contemporary society and in the context of cutting-edge technologies. How research is communicated in a world of instant-access blogging and 140-character micromessaging, and how our expectations of the media affect not only how we publish but how we conduct our research, are questions all scholars need to be aware of and self-critical about.
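As a concrete illustration of the TEI/EpiDoc encoding work discussed above, here is a minimal sketch that builds an EpiDoc-flavoured fragment with Python's standard library. The inscription text is invented and the result is not a schema-valid EpiDoc document, just the markup idiom:

```python
import xml.etree.ElementTree as ET

TEI = "http://www.tei-c.org/ns/1.0"
ET.register_namespace("", TEI)  # serialise TEI as the default namespace

# One expanded abbreviation and one lost-text gap, two of the commonest
# epigraphic phenomena encoded in EpiDoc editions.
div = ET.Element(f"{{{TEI}}}div", {"type": "edition"})
ab = ET.SubElement(div, f"{{{TEI}}}ab")
expan = ET.SubElement(ab, f"{{{TEI}}}expan")
expan.text = "Imp"                      # letters present on the stone
ex = ET.SubElement(expan, f"{{{TEI}}}ex")
ex.text = "erator"                      # letters supplied by the editor
ET.SubElement(ab, f"{{{TEI}}}gap", {"reason": "lost", "extent": "unknown"})

fragment = ET.tostring(div, encoding="unicode")
```

Structured markup of this kind is what makes the downstream publication, corpus search and Linked Open Data work described in the volume possible.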

Relevance: 80.00%

Publisher:

Abstract:

The mammalian transcriptome harbours shadowy entities that resist classification and analysis. In analogy with pseudogenes, we define pseudo-messenger RNA to be RNA molecules that resemble protein-coding mRNA, but cannot encode full-length proteins owing to disruptions of the reading frame. Using a rigorous computational pipeline, which rules out sequencing errors, we identify 10,679 pseudo-messenger RNAs (approximately half of which are transposon-associated) among the 102,801 FANTOM3 mouse cDNAs: just over 10% of the FANTOM3 transcriptome. These comprise not only transcribed pseudogenes, but also disrupted splice variants of otherwise protein-coding genes. Some may encode truncated proteins, only a minority of which appear subject to nonsense-mediated decay. The presence of an excess of transcripts whose only disruptions are opal stop codons suggests that there are more selenoproteins than currently estimated. We also describe compensatory frameshifts, where a segment of the gene has changed frame but remains translatable. In summary, we survey a large class of non-standard but potentially functional transcripts that are likely to encode genetic information and effect biological processes in novel ways. Many of these transcripts do not correspond cleanly to any identifiable object in the genome, implying fundamental limits to the goal of annotating all functional elements at the genome sequence level.
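The kind of reading-frame check underlying such a pipeline can be sketched in a few lines (a toy illustration, not the authors' pipeline, which also rules out sequencing error): scan a cDNA in a fixed frame for premature stops, treating TGA-only disruptions as selenoprotein candidates:

```python
# Function names and the example sequence are invented for illustration.
STOP_CODONS = {"TAA", "TAG", "TGA"}
OPAL = "TGA"  # can be recoded as selenocysteine in selenoproteins

def codons(cdna, frame=0):
    """Split a sequence into complete codons starting at `frame`."""
    return [cdna[i:i + 3] for i in range(frame, len(cdna) - 2, 3)]

def premature_stops(cdna, frame=0):
    """Indices of in-frame stop codons occurring before the final codon."""
    cs = codons(cdna, frame)
    return [i for i, c in enumerate(cs[:-1]) if c in STOP_CODONS]

def opal_only(cdna, frame=0):
    """True if the frame is disrupted solely by opal (TGA) stops."""
    cs = codons(cdna, frame)
    stops = [cs[i] for i in premature_stops(cdna, frame)]
    return bool(stops) and all(c == OPAL for c in stops)

seq = "ATGGCCTGACCCGGGTAA"  # ATG GCC TGA CCC GGG TAA
```

Transcripts flagged by `opal_only` are the cases the abstract uses to argue that selenoproteins are undercounted.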

Relevance: 80.00%

Publisher:

Abstract:

Literary representations of non-standard language in Arthur Masson (Wallonia), Fonson & Wicheler (Brussels) and Pagnol (Marseille)

Relevance: 80.00%

Publisher:

Abstract:

A generalized systematic description of the Two-Wave Mixing (TWM) process in sillenite crystals allowing for arbitrary orientation of the grating vector is presented. An analytical expression for the TWM gain is obtained for the special case of plane waves in a thin crystal (|g|d ≪ 1) with large optical activity (|g|/ρ ≪ 1, where g is the coupling constant, ρ the rotatory power, and d the crystal thickness). Using a two-dimensional formulation, the scope of the nonlinear equations describing TWM can be extended to finite beams in arbitrary geometries and to any crystal parameters. Two promising applications of this formulation are proposed. The polarization dependence of the TWM gain is used for the flattening of Gaussian beam profiles without expanding them. The dependence of the TWM gain on the interaction length is used for the determination of the crystal orientation. Experiments carried out on Bi12GeO20 crystals of a non-standard cut are in good agreement with the results of modelling.
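For orientation, the textbook two-beam coupling gain (with pump depletion, absorption neglected) can be evaluated numerically. This is the standard isotropic result, not the paper's polarization-resolved expression; treating optical activity as a simple reduction of the coupling constant is an assumption made here for illustration:

```python
import math

def twm_gain(gamma, d, m):
    """Two-wave-mixing signal gain G = I_s(d) / I_s(0).

    Standard pump-depletion solution (absorption neglected):
        G = (1 + m) * exp(gamma * d) / (1 + m * exp(gamma * d)),
    where m = I_s(0) / I_p(0) is the input signal-to-pump intensity
    ratio and gamma an effective coupling constant.
    """
    e = math.exp(gamma * d)
    return (1.0 + m) * e / (1.0 + m * e)

# Weak-signal limit (m -> 0): the gain approaches exp(gamma * d).
g_weak = twm_gain(gamma=2.0, d=1.0, m=1e-6)
```

The thin-crystal, large-optical-activity regime of the abstract corresponds to the limit where gamma * d is small and gamma itself is averaged down by polarization rotation.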

Relevance: 80.00%

Publisher:

Abstract:

The use of high-chromium cast irons for abrasive wear resistance is restricted by their poor fracture toughness. An attempt was made to improve the fracture characteristics by altering the distribution, size and shape of the eutectic carbide phase without sacrificing their excellent wear resistance. This was achieved by additions of molybdenum or tungsten followed by high-temperature heat treatments. The absence of these alloying elements, or their replacement with vanadium or manganese, did not show any significant effect, and the continuous eutectic carbide morphology remained the same after high-temperature heat treatments. The fracture characteristics of the alloys with these metallurgical variables were evaluated for both sharp cracks and blunt notches. The results were used in conjunction with metallographic and fractographic observations to establish possible failure mechanisms. The fracture mechanism of the austenitic alloys was found to be controlled not only by the volume per cent but also, to a great extent, by the size and distribution of the eutectic carbides. On the other hand, the fracture mechanism of the martensitic alloys was independent of the eutectic carbide morphology. The uniformity of the secondary carbide precipitation during hardening heat treatments was shown to be the reason why consistent fracture toughness results were obtained for this series of alloys even though their eutectic carbide morphologies differed. The collected data were applied to a model which incorporated the microstructural parameters and correlated them with the experimentally obtained valid stress intensity factors. The stress intensity coefficients of different short-bar fracture toughness test specimens were evaluated from analytical and experimental compliance studies. The validity and applicability of this non-standard testing technique for determining the fracture toughness of high-chromium cast irons were investigated. The results correlated well with valid results obtained from standard fracture toughness tests.
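The basic quantities behind such fracture toughness testing can be sketched as follows. The Y = 1 geometry factor and the ASTM-style plane-strain size check are generic textbook relations standing in for the short-bar compliance calibrations, which are specimen-specific:

```python
import math

def stress_intensity(sigma_mpa, a_m, Y=1.0):
    """K_I = Y * sigma * sqrt(pi * a), in MPa*sqrt(m).

    Y is the dimensionless geometry factor; Y = 1 is the through crack
    in an infinite plate, a deliberately simple stand-in here.
    """
    return Y * sigma_mpa * math.sqrt(math.pi * a_m)

def is_valid_plane_strain(k_ic, sigma_yield_mpa, thickness_m):
    """ASTM-style specimen size check: B >= 2.5 * (K_Ic / sigma_y)**2."""
    return thickness_m >= 2.5 * (k_ic / sigma_yield_mpa) ** 2

# 10 mm crack loaded to 100 MPa; assumed 400 MPa yield strength.
K = stress_intensity(sigma_mpa=100.0, a_m=0.010)
```

The short-bar technique studied here replaces Y with an experimentally calibrated stress intensity coefficient, which is exactly what the compliance studies in the abstract determine.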

Relevance: 80.00%

Publisher:

Abstract:

This study examined the use of non-standard parameters to investigate the visual field, with particular reference to the detection of glaucomatous visual field loss. Evaluation of the new perimetric strategy for threshold estimation, FASTPAC, demonstrated a reduction in examination time for normal subjects compared with the standard strategy. Despite increased within-test variability, the FASTPAC strategy produced a mean sensitivity similar to the standard strategy while reducing the effects of patient fatigue. The new technique of Blue-Yellow perimetry was compared with White-White perimetry for the detection of glaucomatous field loss in ocular hypertension (OHT) and primary open-angle glaucoma (POAG). Using a database of normal subjects, confidence limits for normality were constructed to account for the increased between-subject variability with increasing age and eccentricity, and for the greater variability of the Blue-Yellow field compared with the White-White field. Individual ocular media absorption had little effect on Blue-Yellow field variability. Total and pattern probability analysis revealed five of 27 OHT patients to exhibit Blue-Yellow focal abnormalities; two of these patients subsequently developed White-White loss. Twelve of the 24 POAG patients revealed wider and/or deeper Blue-Yellow loss compared with the White-White field. Blue-Yellow perimetry showed good sensitivity and specificity characteristics; however, lack of perimetric experience and the presence of cataract influenced the Blue-Yellow visual field and may confound the interpretation of Blue-Yellow visual field loss. Visual field indices demonstrated a moderate relationship to the structural parameters of the optic nerve head measured with scanning laser tomography. No abnormalities in Blue-Yellow or Red-Green colour contrast sensitivity were apparent in the OHT patients. A greater vulnerability of the short-wavelength-sensitive (SWS) pathway in glaucoma was demonstrated using Blue-Yellow perimetry; however, predicting which patients may benefit from Blue-Yellow perimetric examination is difficult. Furthermore, cataract and the extent of the field loss may limit the extent to which the integrity of the SWS channels can be selectively examined.
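The construction of confidence limits for normality can be sketched in miniature. This is a toy version: the study additionally widens limits with age, eccentricity and the greater Blue-Yellow variability, all of which this ignores, and the sensitivity values below are invented:

```python
import statistics

def normative_limits(sensitivities_db, z=1.96):
    """Two-sided limits of normality for one test location.

    Assumes approximate normality of sensitivities across subjects;
    a real normative database stratifies by age and eccentricity.
    """
    mean = statistics.fmean(sensitivities_db)
    sd = statistics.stdev(sensitivities_db)
    return mean - z * sd, mean + z * sd

# Hypothetical normal sensitivities (dB) at one eccentric location.
normals = [30.1, 31.4, 29.8, 30.9, 28.7, 31.0, 30.2, 29.5]
lo, hi = normative_limits(normals)
```

A measured sensitivity falling below the lower limit at enough locations is what drives the total and pattern probability flags described above.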

Relevance: 80.00%

Publisher:

Abstract:

Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree about the correct approach, leading to conflict between authors, editors, and reviewers. The objective of this article is to provide statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that can be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments on data analysis concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tail testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
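One of the procedures mentioned, the Bonferroni correction, is simple enough to sketch directly (a generic illustration, not code from the article): each of m comparisons is tested at alpha/m so that the family-wise error rate stays at or below alpha:

```python
def bonferroni(p_values, alpha=0.05):
    """Return per-comparison significance flags and the adjusted threshold.

    Each of the m comparisons is tested at alpha / m, which keeps the
    family-wise error rate at or below alpha (conservative for large m).
    """
    m = len(p_values)
    threshold = alpha / m
    return [p <= threshold for p in p_values], threshold

# Three comparisons at alpha = 0.05: only the first survives correction.
flags, thr = bonferroni([0.001, 0.02, 0.049])
```

Reporting both the raw p-values and the adjusted threshold, as here, is in the spirit of the article's advice to describe and justify each stage of the analysis.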