Abstract:
This thesis contributes to the discussion of the relationship between agricultural production structure, employment and poverty in Brazil, specifically in the state of Minas Gerais, in 2010. Employment is an increasingly challenging issue given the ongoing modernization of agriculture, which is capital intensive and labor saving and pursues ever-higher levels of production and productivity. Productive inclusion can be an effective route out of poverty, since work is often the only asset of the poor. We therefore investigated which activities or groups of activities employed the largest numbers of people and generated the highest incomes, and thus may have contributed to a lower incidence of poverty. The primary data source was the 2010 Population Census (microdata). To achieve these objectives we used descriptive analysis, Pearson correlation coefficients and quantile regressions. Among the main findings: crop farming employed more people and generated higher total income than livestock raising, but was more precarious, with lower average incomes and income percentile values, greater heterogeneity and instability, and higher proportions of poor workers. Overall, commodity activities showed greater formalization and lower proportions of poor workers. Within crop farming, commodity activities employed fewer people and generated lower total and average incomes (although with slightly higher income percentiles and more informality) and had lower poverty indicators than non-commodity activities (whose incomes were more heterogeneous). Within livestock, commodity activities had higher shares of employment and income (although slightly lower average incomes and percentile values) and smaller proportions of poor workers than non-commodity activities (again more heterogeneous). In terms of employment and income, the most prominent activities were unspecified agricultural activities (non-commodity), coffee growing and cattle raising (commodities); coffee growing and cattle raising also had the lowest poverty indicators. Indicators of agricultural production diversification were positively correlated not only with employment in non-commodity activities but also with the proportions of poor and indigent workers and with income concentration. In addition, employment in non-commodity activities was positively correlated with poverty indicators. Notably, employment in soybean, coffee and fruit growing showed negative correlations with the indicators of poverty, indigence and the Gini coefficient. Finally, among crop activities, shifting workers from non-commodity to commodity activities would be 'more equalizing' (coefficients decline along the income distribution) than in livestock. Employment in livestock (mostly non-commodity) would have a greater impact on the lower income deciles, but its coefficients rise again in the top deciles, revealing a more perverse character. Among the activities that would most strongly affect the lower deciles while affecting the higher deciles least were pig farming, poultry farming, citrus growing, coffee and sugar cane. Cattle raising and soybean cultivation had the highest coefficients, but these also rise again in the top deciles, again indicating a more perverse character.
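Since the abstract leans on quantile regressions of income across deciles, a minimal sketch may clarify the method. This is an illustrative Python example assuming statsmodels' quantile regression; the variable names (df, log_income, commodity_occupation) and the simulated data are placeholders, not the census microdata or model specification actually used.

```python
# Illustrative sketch of a quantile regression across income deciles,
# assuming statsmodels; data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    # stand-in for log monthly income of occupied workers
    "log_income": np.log(rng.lognormal(6.5, 0.8, 1000)),
    # stand-in indicator: 1 if occupied in a commodity activity
    "commodity_occupation": rng.binomial(1, 0.4, 1000),
})

# How the occupation coefficient varies along the income distribution
for q in (0.1, 0.3, 0.5, 0.7, 0.9):
    fit = smf.quantreg("log_income ~ commodity_occupation", df).fit(q=q)
    print(f"quantile {q:.1f}: beta = {fit.params['commodity_occupation']:+.3f}")
```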
Abstract:
As pressure continues to grow on Diamond and the world's synchrotrons for higher throughput of diffraction experiments, new and novel techniques are required for presenting micron-dimension crystals to the X-ray beam. Currently this task is both labour intensive and primarily a serial process. Diffraction measurements typically take milliseconds, but sample preparation and presentation can reduce throughput to as little as 4 measurements an hour. With beamline waiting times as long as two years, it is of key importance for researchers to capitalize on available beam time, generating as much data as possible. Other approaches detailed in the literature [1] [2] [3] are very much skewed towards automating, with robotics, the actions of human protocols. The work detailed here is the development and discussion of a bottom-up approach relying on SSAW (standing surface acoustic wave) self-assembly, including material selection, microfluidic integration and tuning of the acoustic cavity to order the protein crystals.
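For scale, a back-of-envelope sketch of the acoustic ordering principle: in an SSAW device, particles collect at pressure nodes spaced half an acoustic wavelength apart, so the node pitch follows from the drive frequency and the substrate's sound speed. The values below are typical literature figures, not parameters from this work.

```python
# Hedged back-of-envelope: pressure-node pitch in an SSAW cavity.
# Substrate speed and drive frequency are typical assumed values.
saw_speed = 3990.0   # m/s, SAW speed on 128-degree Y-cut LiNbO3 (typical)
frequency = 20e6     # Hz, example drive frequency

wavelength = saw_speed / frequency
node_pitch = wavelength / 2.0  # particles gather at pressure nodes
print(f"pressure-node pitch ~ {node_pitch * 1e6:.0f} micrometres")
```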
Abstract:
Few symbols of 1950s-1960s America remain as central to our contemporary conception of Cold War culture as the iconic ranch-style suburban home. While the house took center stage in the Nixon-Khrushchev kitchen debate as a symbol of modern efficiency and capitalist values, its popularity depended largely upon its obvious appropriation of vernacular architecture from the 19th century, those California haciendas and Texas dogtrots that dotted the American west. Contractors like William Levitt modernized the historical common houses, hermetically sealing their porous construction, all while using the ranch-style roots of the dwelling to galvanize a myth of an indigenous American culture. At a moment of intense occupational bureaucracy, political uncertainty and atomized social life, the rancher gave a self-identifying white consumer base reason to believe they could master their own plot in the expansive frontier. Only one example of America’s mid-century love affair with commodified vernacular forms, the ranch-style home represents a broad effort on the part of corporate and governmental interest groups to transform the vernacular into a style that expresses a distinctly homogeneous vision of American culture. “Other than a Citizen” begins with an anatomy of that transformation, and then turns to the work of four poets who sought to reclaim the vernacular from that process of standardization and use it to countermand the containment-era strategies of Cold War America.
In four chapters, I trace references to common speech and verbal expressivity in the poetry and poetic theory of Charles Olson, Robert Duncan, LeRoi Jones/Amiri Baraka and Gwendolyn Brooks, against the historical backdrop of the Free Speech Movement and the rise of mass culture. When poets frame nonliterary speech within the literary page, they encounter the inability of writing to capture the vital ephemerality of verbal expression. Rather than treat this limitation as an impediment, the writers in my study use the poem to dramatize the fugitivity of speech, emphasizing it as a disruptive counterpoint to the technologies of capture. Where critics such as Houston Baker interpret the vernacular strictly in terms of resistance, I take a cue from the poets and argue that the vernacular, rooted etymologically at the intersection of domestic security and enslaved margin, represents a gestalt form, capable at once of establishing centralized power and sparking minor protest. My argument also expands upon Michael North’s exploration of the influence of minstrelsy and regionalism on the development of modernist literary technique in The Dialect of Modernism. Where North focuses on writers from the early 20th century, I account for the next generation, whose America was not a culturally inferior collection of immigrants but an imperial power, replete with economic, political and artistic dominance. Instead of settling for an essentially American idiom, the poets in my study saw in the vernacular not phonetic misspellings, slang terminology and fragmented syntax, but the potential to provoke and thereby frame a more ethical mode of social life, straining against the regimentation of citizenship.
My attention to the vernacular argues for an alignment among writers who have been segregated by the assumption that race and aesthetics are mutually exclusive categories. In reading these writers alongside one another, “Other than a Citizen” shows how the avant-garde concepts of projective poetics and composition by field develop out of an interest in black expressivity. Conversely, I trace black radicalism and its emphasis on sociality back to the communalism practiced at the experimental arts college in Black Mountain, North Carolina, where Olson and Duncan taught. In pressing for this connection, my work reveals the racial politics embedded within the speech-based aesthetics of the postwar era, while foregrounding the aesthetic dimension of militant protest.
Not unlike today, the popular rhetoric of the Cold War insists that to be a citizen involves defending one’s status as a rightful member of an exclusionary nation. To be other than a citizen, as the poets in my study make clear, begins with eschewing the false certainty that accompanies categorical nominalization. In promoting a model of mutually dependent participation, these poets lay the groundwork for an alternative model of civic belonging, where volition and reciprocity replace compliance and self-sufficiency. In reading their lines, we become all the more aware of the cracks that run the length of our load-bearing walls.
Abstract:
Computational fluid dynamic (CFD) studies of blood flow in cerebrovascular aneurysms have the potential to improve patient treatment planning by enabling clinicians and engineers to model patient-specific geometries and compute predictors and risks prior to neurovascular intervention. However, the use of patient-specific computational models in clinical settings remains unfeasible due to their complexity and their computationally intensive, time-consuming nature. An important factor contributing to this challenge is the choice of outlet boundary conditions, which often involves a trade-off between physiological accuracy, patient-specificity, simplicity and speed. In this study, we analyze how resistance and impedance outlet boundary conditions affect blood flow velocities, wall shear stresses and pressure distributions in a patient-specific model of a cerebrovascular aneurysm. We also use geometrical manipulation techniques to obtain a model of the patient’s vasculature prior to aneurysm development, and study how forces and stresses may have been involved in the initiation of aneurysm growth. Our CFD results show that the nature of the prescribed outlet boundary conditions is not as important as the relative distribution of blood flow through each outlet branch. As long as the appropriate parameters are chosen to keep these flow distributions consistent with physiology, resistance boundary conditions, which are simpler, easier to use and more practical than their impedance counterparts, are sufficient to study aneurysm pathophysiology, since they predict very similar wall shear stresses, time-averaged wall shear stresses, time-averaged pressures, and blood flow patterns and velocities. The only situations where impedance boundary conditions should be prioritized are when pressure waveforms are being analyzed, or when local pressure distributions are being evaluated at specific time points, especially at peak systole, where resistance boundary conditions lead to unnaturally large pressure pulses. In addition, we show that in this specific patient, the region of the blood vessel where the neck of the aneurysm developed was subject to abnormally high wall shear stresses, and that regions surrounding blebs on the aneurysmal surface were subject to low, oscillatory wall shear stresses. Computational models using resistance outlet boundary conditions may be suitable to study patient-specific aneurysm progression in a clinical setting, although several other challenges must be addressed before these tools can be applied clinically.
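To make the trade-off concrete, here is a minimal sketch of the two outlet models in their textbook forms: a resistance condition ties outlet pressure to instantaneous flow, while an impedance condition convolves the flow history with an impulse response of the downstream vasculature. The waveform, resistance value and impulse response below are toy placeholders, not the patient-specific values used in the study.

```python
# Hedged sketch: resistance vs. impedance outlet boundary conditions.
# All numbers are illustrative placeholders.
import numpy as np

def resistance_pressure(Q, R, p_distal=0.0):
    """Resistance BC: p(t) = p_distal + R * Q(t)."""
    return p_distal + R * Q

def impedance_pressure(Q, z, dt):
    """Impedance BC: p(t) is the convolution of the flow history
    with the impulse response z of the downstream vascular bed."""
    return dt * np.convolve(Q, z)[: len(Q)]

dt = 0.001                                   # s
t = np.arange(0.0, 1.0, dt)                  # one cardiac cycle
Q = 5.0 + 2.0 * np.sin(2.0 * np.pi * t)      # toy flow waveform (cm^3/s)
z = np.exp(-t / 0.2)                         # toy impulse response
p_res = resistance_pressure(Q, R=1.0e3)      # resistance outlet pressure
p_imp = impedance_pressure(Q, z, dt)         # impedance outlet pressure
```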
Abstract:
Tissue-engineered blood vessels (TEBV) can serve as vascular grafts and may also play an important role in the development of organs-on-a-chip. Most TEBV construction involves scaffolding with biomaterials such as collagen gel or electrospun fibrous mesh. Hypothesizing that a scaffold-free TEBV may be advantageous, we constructed a tubular structure (1 mm i.d.) from aligned human mesenchymal cell (hMSC) sheets as the wall and a human endothelial progenitor cell (hEPC) coating as the lumen. The burst pressure of the scaffold-free TEBV was above 200 mmHg after three weeks of sequential culture in a rotating wall bioreactor and perfusion at 6.8 dynes/cm². The interwoven organization of the cell layers and extensive extracellular matrix (ECM) formation of the hMSC-based TEBV resembled those of native blood vessels. The TEBV exhibited flow-mediated vasodilation, vasoconstriction after exposure to 1 μM phenylephrine, and released nitric oxide in a manner similar to that of porcine femoral vein. HL-60 cells attached to the TEBV lumen after TNF-α activation, suggesting a functional endothelium. This study demonstrates the potential of a hEPC-endothelialized hMSC-based TEBV for drug screening.
Abstract:
The use of DNA as a polymeric building material transcends its function in biology and has generated excitement in bionanotechnology for applications ranging from biosensing to diagnostics to targeted drug delivery. These applications are enabled by DNA’s unique structural and chemical properties, embodied as a directional polyanion that exhibits molecular recognition capabilities. Hence, the efficient and precise synthesis of high molecular weight DNA materials has become key to advancing DNA bionanotechnology. Current synthesis methods largely rely on either solid phase chemical synthesis or template-dependent polymerase amplification. The inherent step-by-step fashion of solid phase synthesis limits the length of the resulting DNA to typically less than 150 nucleotides. In contrast, polymerase-based enzymatic synthesis methods (e.g., the polymerase chain reaction) are not limited by product length, but require a DNA template to guide the synthesis. Furthermore, advanced DNA bionanotechnology requires tailorable structural and self-assembly properties. Current synthesis methods, however, often involve multiple conjugation reactions and extensive purification steps.
The research described in this dissertation aims to develop a facile method to synthesize high molecular weight, single-stranded DNA (or polynucleotides) with versatile functionalities. We exploit the ability of a template-independent DNA polymerase, terminal deoxynucleotidyl transferase (TdT), to catalyze the polymerization of 2’-deoxyribonucleoside 5’-triphosphates (dNTPs, the monomer) from the 3’-hydroxyl group of an oligodeoxyribonucleotide (the initiator). We termed this enzymatic synthesis method TdT-catalyzed enzymatic polymerization, or TcEP.
This dissertation is structured around three specific research aims. With the objective of generating high molecular weight polynucleotides, Specific Aim 1 studies the reaction kinetics of TcEP by investigating the polymerization of 2’-deoxythymidine 5’-triphosphates (monomer) from the 3’-hydroxyl group of oligodeoxyribothymidine (initiator) using in situ 1H NMR and fluorescent gel electrophoresis. We found that TcEP kinetics follows the “living” chain-growth polycondensation mechanism and, as in “living” polymerizations, the molecular weight of the final product is determined by the starting molar ratio of monomer to initiator. The molecular weight distribution is crucially influenced by the molar ratio of initiator to TdT. We developed a reaction kinetics model that allows us to quantitatively describe the reaction and predict the molecular weight of the reaction products.
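As a concrete illustration of the “living” relation invoked above, a minimal sketch: in an ideal living polymerization, the number-average degree of polymerization equals the monomer-to-initiator ratio times conversion, and the chain-length distribution is approximately Poisson, so dispersity approaches 1 for long chains. The numbers are illustrative, not the fitted parameters from the dissertation.

```python
# Hedged sketch of the ideal "living" polymerization relations;
# values are illustrative, not fitted TcEP parameters.
M0_over_I0 = 500.0   # starting molar ratio [monomer]/[initiator]
conversion = 0.95    # fraction of monomer consumed

DP_n = M0_over_I0 * conversion   # number-average degree of polymerization
PDI = 1.0 + 1.0 / DP_n           # Poisson dispersity, -> 1 for long chains

print(f"DP_n ~ {DP_n:.0f} nucleotides, dispersity ~ {PDI:.4f}")
```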
Specific Aim 2 further explores TcEP’s ability to move beyond homo-polynucleotide synthesis by varying the choice of initiators and monomers. We investigated the effects of initiator length and sequence on TcEP, and found that an effective initiator should be at least 10 nucleotides long and that the formation of secondary structures close to the 3’-hydroxyl group can impede the polymerization reaction. We also demonstrated TcEP’s capacity to incorporate a wide range of unnatural dNTPs into the growing chain, such as hydrophobic fluorescent dNTPs and fluoro-modified dNTPs. By harnessing the encoded nucleotide sequence of an initiator and the chemical diversity of monomers, TcEP enables us to introduce molecular recognition capabilities and chemical functionalities at the 5’-terminus and 3’-terminus, respectively.
Building on TcEP’s synthesis capacities, in Specific Aim 3 we devised a two-step strategy to synthesize diblock amphiphilic polynucleotides, in which the first, hydrophilic block serves as a macro-initiator for the growth of the second block, composed of natural and/or unnatural nucleotides. By tuning the hydrophilic block length, we synthesized amphiphilic diblock polynucleotides that can self-assemble into micellar structures ranging from star-like to crew-cut morphologies. The observed self-assembly behaviors agree with predictions from dissipative particle dynamics simulations as well as with scaling laws for polyelectrolyte block copolymers.
In summary, we developed an enzymatic synthesis method (TcEP) that enables the facile synthesis of high molecular weight polynucleotides with low polydispersity. Although we can control the nucleotide sequence only to a limited extent, TcEP offers a way to integrate an oligodeoxyribonucleotide with a specific sequence at the 5’-terminus while simultaneously incorporating functional groups along the growing chain. Additionally, we used TcEP to synthesize amphiphilic polynucleotides that display self-assembly behavior. We anticipate that this facile synthesis method will not only advance molecular biology, but also invigorate materials science and bionanotechnology.
Abstract:
The main objective of this thesis was to create, implement and evaluate the effectiveness of a cognitive remediation program acting comparably on the fluid (Gf) and crystallized (Gc) aspects of intelligence, within a population of clinical interest: adolescents with borderline intellectual functioning (BIF). Given the high prevalence of this condition, the GAME (Gains et Apprentissages Multiples pour Enfant) remediation program was developed around commercially available games in order to facilitate access to, and implementation of, the program in a variety of settings.
The first article of this thesis, a systematic review of the literature, aimed to take stock of published studies using games as a cognitive remediation tool in the pediatric population. The effectiveness and the quality of the paradigms used were evaluated, and recommendations on the methodological aspects to respect in this type of study were proposed. This article provided a better understanding of the pitfalls to avoid and the methodological strengths to incorporate when creating the GAME remediation program. Several methodological caveats identified in this article improved the quality of the cognitive remediation program developed in this thesis project.
Given the scarcity of studies in the scientific literature concerning the population with BIF (70
Abstract:
Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field-sensitivity, array-sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity in Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of these integrity properties during execution.
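A minimal sketch of the default-deny whitelist logic the abstract describes for KQguard: legitimate KQ request signatures are learned offline, and any runtime request that cannot be matched is rejected. The signature fields and module names below are hypothetical illustrations, not the actual representation derived in the thesis.

```python
# Hedged sketch of KQguard-style whitelist validation (default deny).
# Signature fields and module names are hypothetical illustrations.
legitimate_requests = {
    # (owning module, callback symbol) pairs learned during training
    ("ext4.ko", "timer_callback"),
    ("e1000.ko", "dpc_callback"),
}

def validate_kq_request(owner_module: str, callback: str) -> bool:
    """Accept only KQ requests whose signature was seen in training."""
    return (owner_module, callback) in legitimate_requests

assert validate_kq_request("ext4.ko", "timer_callback")          # known: accept
assert not validate_kq_request("rootkit.ko", "hook_callback")    # unknown: reject
```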
Abstract:
Interstitial water analyses made at 12 sites during Leg 117 are used to define the nature of diagenetic reactions in organic-rich sediments on the Owen Ridge and Oman Margin. Minor variations in chloride concentration profiles are ascribed to past changes in bottom water salinity at two mid-depth margin sites and to upward migration of low salinity water at another. There is no evidence for subsurface brine movement, unlike the case on the Peru Margin. Dolomitization is widespread and accounts for the depletions of magnesium observed in pore waters at variable depths at nearly all sites. The mineral occurs both as disseminated euhedral limpid crystals and, in at least one location, in massive stringers. Formation of the latter is suggested to reflect precipitation during sea level transgressions when the sedimentation rate was low, but when productivity was high. Authigenic carbonate fluorapatite is also widespread, the phosphorus being derived from the breakdown of organic matter. Sulfate is quantitatively depleted at depth at most locations but the rate of depletion is markedly less than that observed on the Peru Margin where sedimentation is also similarly influenced by high rates of upwelling. The reason for this contrast is not clear and merits further investigation.
Abstract:
Aerial observations of light pollution can fill an important gap between ground-based surveys and nighttime satellite data. Terrestrially bound surveys are labor intensive and are generally limited to a small spatial extent, and while existing satellite data cover the whole world, they are limited to coarse resolution. This paper describes the production of a high resolution (1 m) mosaic image of the city of Berlin, Germany at night. The dataset is spatially analyzed to identify the major sources of light pollution in the city based on urban land use data. An area-independent 'brightness factor' is introduced that allows direct comparison of the light emission from differently sized land use classes, and the percentage of area with values above average brightness is calculated for each class. Using this methodology, lighting associated with streets is found to be the dominant source of zenith-directed light pollution (31.6%), although other land use classes have much higher average brightness. These results are compared with other urban light pollution quantification studies. The minimum resolution required for an analysis of this type is found to be near 10 m. Future applications of high resolution datasets such as this one could include: studies of the efficacy of light pollution mitigation measures, improved light pollution simulations, analyses of economics and energy use, the relationship between artificial light and ecological parameters (e.g. circadian rhythm, fitness, mate selection, species distributions, migration barriers and seasonal behavior), or the management of nightscapes. To encourage further scientific inquiry, the mosaic data is freely available at Pangaea.
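One plausible reading of such an area-independent comparison, sketched below: normalize each land-use class's mean pixel brightness by the citywide mean, which removes the effect of class area. The paper's exact definition of the 'brightness factor' may differ, and the arrays here are toys standing in for the 1 m mosaic and the land-use raster.

```python
# Hedged sketch of an area-independent per-class brightness comparison.
# Toy arrays stand in for the night-time mosaic and land-use raster.
import numpy as np

rng = np.random.default_rng(0)
mosaic = rng.random((100, 100))               # brightness per pixel
land_use = rng.integers(0, 3, (100, 100))     # land-use class per pixel

city_mean = mosaic.mean()
for cls in np.unique(land_use):
    class_mean = mosaic[land_use == cls].mean()
    print(f"class {cls}: brightness factor = {class_mean / city_mean:.2f}")
```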
Abstract:
This paper presents for the first time a full account of the deformation behaviour during the injection stretch blow moulding (ISBM) process for poly(ethylene terephthalate) (PET) containers, namely thin-walled rigid bottles. The processes required to form PET bottles are complicated and extensive; any development in understanding the nature of material deformation can potentially improve the bottle optimisation process. Removing the bottle mould and performing free-stretch-blow (FSB) experiments revealed insight into the bottle forming characteristics at various preform temperatures and blowing rates. The process outputs, cavity pressure and stretch-rod force, were recorded using an instrumented stretch-rod, and preform surface strain mapping was determined using a combination of a unique patterning procedure and high speed stereoscopic digital image correlation. This unprecedented experimental analysis reveals that the deformation behaviour varies considerably with contrasting process input parameters. Investigation into the effects on deformation mode, strain rate and final bottle shape provides a basis for full understanding of process optimisation and therefore how the process inputs may aid development of the preferred optimised container.
Abstract:
Procedural justice advocates argue that fair procedures in decision-making processes can increase participant satisfaction with legal institutions. Little critical work has been done, however, to explore the power of such claims in the context of mass violence and international criminal justice. This article critically examines some of the key claims of procedural justice by exploring the perceptions of justice held by victims participating as Civil Parties in the Extraordinary Chambers in the Courts of Cambodia (ECCC). The ECCC has created one of the most inclusive and extensive victim participation regimes within international criminal law. It therefore provides a unique case study for examining some of the claims of ‘victim-centred’ transitional justice through a procedural justice lens. It finds that while procedural justice influenced civil parties’ overall perceptions of the Court, outcomes remained of primary importance. It concludes by analysing the possible reasons for this prioritisation.
Abstract:
Ageing and deterioration of infrastructure is a challenge facing transport authorities. In particular, there is a need for increased bridge monitoring in order to provide adequate maintenance, prioritise the allocation of funds and guarantee acceptable levels of transport safety. Existing bridge structural health monitoring (SHM) techniques typically involve direct instrumentation of the bridge with sensors and equipment for the measurement of properties such as frequencies of vibration. These techniques are important as they can indicate deterioration of the bridge condition. However, they can be labour intensive and expensive due to the requirement for on-site installations. In recent years, alternative low-cost indirect vibration-based SHM approaches have been proposed which utilise the dynamic response of a vehicle to carry out “drive-by” pavement and/or bridge monitoring. The vehicle is fitted with sensors on its axles, thus reducing the need for on-site installations. This paper investigates the use of low-cost sensors incorporating global navigation satellite systems (GNSS) for implementation of the drive-by system in practice, via field trials with an instrumented vehicle. The potential of smartphone technology to be harnessed for drive-by monitoring is established, while smartphone GNSS tracking applications are found to compare favourably in terms of accuracy, cost and ease of use with professional GNSS devices.
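A minimal sketch of the core signal-processing step behind drive-by monitoring: estimating the dominant vibration frequency from an axle-mounted (or smartphone) accelerometer record via its spectrum. The synthetic 4 Hz component below stands in for a bridge frequency; real records would also need filtering and separation of vehicle-dynamics peaks.

```python
# Hedged sketch: dominant-frequency estimate from an accelerometer record.
# The 4 Hz signal is synthetic, standing in for a bridge frequency.
import numpy as np

fs = 100.0                                   # sampling rate (Hz)
t = np.arange(0.0, 30.0, 1.0 / fs)
accel = np.sin(2 * np.pi * 4.0 * t) + 0.5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
print(f"dominant frequency ~ {peak:.2f} Hz")
```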
Abstract:
Background: What role do companies play in our society? What function should they serve? Is the function of companies to maximise profit and look only to their own interests, or do they have a broader responsibility and obligations towards society? These questions have been discussed for a long time, and the image of companies and enterprise changes continuously as society changes. The idea that companies have a social responsibility alongside their economic one has spread across the world, and questions have arisen as to whether there is a conflict between social responsibility and companies' profit motive, or whether social responsibility might instead lead to increased profitability. Purpose: To explain, through an empirical study, whether the reported CSR work of Swedish listed companies has a positive effect on their profitability. Method: To achieve this purpose, a deductive quantitative method was chosen in order to allow statistical generalisation. Reported CSR work is operationalised using Folksam's report "Index för ansvarsfullt företagande" (Index of Responsible Business), and profitability is measured via return on total assets (ROA) and profit margin. The analysis is carried out using multiple regression analyses. Conclusion: The results of the study show that companies' reported CSR work has a positive effect on the profitability of Swedish listed companies, measured both as return on total assets (ROA) and as profit margin.
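As an illustration of the study's method, a minimal sketch of a multiple regression of profitability on a reported-CSR index plus a control. Variable names and the simulated data are hypothetical placeholders; the study used Folksam's index scores and Swedish listed-company financials.

```python
# Hedged sketch: OLS regression of ROA on a CSR index plus a control.
# Data and variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "roa": rng.normal(0.06, 0.04, n),      # return on total assets
    "csr_index": rng.uniform(0, 7, n),     # stand-in CSR score
    "size": rng.normal(15, 2, n),          # e.g. log total assets
})

fit = smf.ols("roa ~ csr_index + size", df).fit()
print(fit.summary().tables[1])             # coefficient table
```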