953 results for Height Tolerance of Concrete Blocks


Relevance:

100.00%

Publisher:

Abstract:

Yolk color and egg white (albumen) cleanliness and viscosity are important parameters by which consumers judge the quality of eggs. This study aimed to investigate changes in albumen viscosity during storage of eggs for up to 36 days from two different commercial laying hen strains (Carijo Barbada and Isa Brown) fed diets containing annatto (1.5 or 2.0%), a synthetic additive, or no colorant (control). Analyses of moisture content, albumen height, pH, viscosity, foam formation, and foam stability were carried out on the eggs. The Carijo Barbada strain had a smaller albumen, lower moisture content and higher egg-white viscosity than the Isa Brown strain; however, viscosity decreased significantly in both strains during storage. Initially, the addition of 2.0% annatto or the synthetic additive increased viscosity in both strains, but during storage only the control maintained its viscosity for longer. The lower viscosity did not change foam density or stability.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to partially characterize some genes involved in the desiccation tolerance of the embryonic axis of Melanoxylon brauna seeds subjected or not to fast oven-drying. Seeds were initially dried rapidly in an oven at 40 °C, 50 °C, 60 °C, 70 °C, and 80 °C for 24, 48 and 72 h and then subjected to germination tests and moisture content determination. Degenerate primers were designed for 19 genes. The cDNA was used as a template for PCR amplifications with the degenerate primers, and the PCR products obtained were purified, cloned and sequenced. The seeds showed a gradual reduction in germination percentage with increasing drying temperature and time. Nucleotide sequences of the cloned fragments related to the genes CAT1, SPS1, Abi5, Transk and PM25 were obtained. Similarity analysis against the sequences deposited in databases revealed similarities with the genes CAT1, SPS1, Transk and PM25 from other plant species. The nucleotide sequences obtained for these genes will be used to design specific primers for gene expression analyses during seed germination, in order to understand the causes of the loss of physiological quality of Melanoxylon brauna seeds.

Relevance:

100.00%

Publisher:

Abstract:

Hydrogen (H2) fuel cells have been considered a promising renewable energy source. The recent growth of the H2 economy requires highly sensitive, micro-sized and cost-effective H2 sensors for monitoring concentrations and alerting to leakages, since H2 is flammable and explosive. Titanium dioxide (TiO2) produced by electrochemical anodic oxidation has shown great potential as an H2-sensing material. The aim of this thesis is to develop a highly sensitive H2 sensor using anodized TiO2. Preparing the oxide layer on a suitable substrate allows the sensor to be mass produced and integrated with microelectronics. The morphology, elemental composition, crystal phase, electrical properties and H2-sensing properties of TiO2 nanostructures prepared on Ti foil, Si and SiO2/Si substrates were characterized. Initially, vertically oriented TiO2 nanotubes were obtained as the sensing material by anodizing Ti foil. The morphology of the tubes could be tailored by varying the anodization voltage. The transparent oxide layer creates an interference color phenomenon under white-light illumination of the oxide surface, and this coloration effect can be used to predict the morphology of the TiO2 nanostructures. A crystal-phase transition from amorphous to anatase, to rutile, or to a mixture of anatase and rutile was observed with varying heat-treatment temperature. However, the H2-sensing properties of the TiO2 nanotubes at room temperature were insufficient. H2 sensors using TiO2 nanostructures formed on Si and SiO2/Si substrates were then demonstrated. In both cases, a Ti layer deposited on the substrate by DC magnetron sputtering was successfully anodized. A mesoporous TiO2 layer obtained on Si by anodization in an aqueous electrolyte at 5 °C showed diode behavior, which was influenced by the work-function difference between the Pt metal electrodes and the oxide layer. This sensor enabled the detection of H2 (20-1000 ppm) at low operating temperatures (50–140 °C) in ambient air. A Pd-decorated tubular TiO2 layer was prepared on a SiO2/Si wafer patterned with metal electrodes by anodization in an organic electrolyte at 5 °C. This sensor showed significantly enhanced H2-sensing properties and detected hydrogen in the range of a few ppm with fast response/recovery times. The metal electrodes placed under the oxide layer also enhanced the mechanical tolerance of the sensor. The concept of TiO2 nanostructures on alternative substrates is promising for microelectronic applications and mass production of gas sensors. The gas-sensing properties can be further improved by modifying the material morphology and decorating it with catalytic materials.
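The abstract does not define how the sensing performance was quantified; a common convention for n-type oxide sensors exposed to a reducing gas such as H2 is the resistance ratio R_air/R_gas together with 90% response/recovery times. The sketch below (Python, with purely hypothetical variable names and illustrative numbers) only shows that generic convention, not the thesis's own analysis.

```python
import numpy as np

def sensor_response(r_air, r_gas):
    """One common response definition for an n-type oxide sensor
    exposed to a reducing gas: R_air / R_gas."""
    return r_air / r_gas

def t90(time, resistance, r_start, r_end):
    """Time needed to cover 90% of the resistance change from r_start to r_end."""
    target = r_start + 0.9 * (r_end - r_start)
    # first time point at which the signal has crossed the 90% level
    crossed = np.where((resistance - target) * np.sign(r_end - r_start) >= 0)[0]
    return time[crossed[0]] - time[0] if crossed.size else np.nan

# Hypothetical resistance trace during a gas pulse (values are illustrative only).
t = np.linspace(0, 120, 241)                      # seconds
r = 1e6 * np.exp(-t / 15) + 1e4                   # decaying resistance in ohms
print(sensor_response(r_air=r[0], r_gas=r[-1]))   # dimensionless response
print(t90(t, r, r_start=r[0], r_end=r[-1]))       # response time in seconds
```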

Relevance:

100.00%

Publisher:

Abstract:

Violence has always been a part of the human experience, and therefore a popular topic for research. It is a controversial issue, mostly because the possible sources of violent behaviour are so varied, encompassing both biological and environmental factors. However, very little disagreement is found regarding the severity of this societal problem. Most researchers agree that the number and intensity of aggressive acts among adults and children are growing. Not surprisingly, many educational policies, programs, and curricula have been developed to address this concern. The research favours programs which address the root causes of violence and seek to prevent rather than provide consequences for the undesirable behaviour. But what makes a violence prevention program effective? How should educators choose among the many curricula on the market? A review of the literature surrounding violence prevention programs and their effectiveness revealed the Second Step Violence Prevention Curriculum as unique in many ways. It was designed to address the root causes of violence in an active, student-centred way. Empathy training, anger management, interpersonal cognitive problem solving, and behavioural social skills form the basis of this program. Published in 1992, the program has been the topic of limited research, almost entirely carried out using quantitative methodologies. The purpose of this study was to understand what happens when the Second Step Violence Prevention Curriculum is implemented with a group of students and teachers. I was not seeking a statistical correlation between the frequency of violence and program delivery, as in most prior research. Rather, I wished to gain a deeper understanding of the impact of the program through the eyes of the participants. The Second Step Program was taught to a small, primary-level general learning disabilities class by a teacher and a student teacher. Data were gathered using interviews with the teachers, personal observations, staff reports, and my own journal. Common themes across the four types of data collection emerged during the study, and these themes were isolated and explored for meaning. Findings indicate that the program does not offer a "quick fix" to this serious problem. However, several important discoveries were made. The teachers felt that the program was effective despite a lack of concrete evidence to support this claim. They used the Second Step strategies outside their actual instructional time and felt it made them better educators and disciplinarians. The students did not display a marked change in their behaviour during or after the program implementation, but they were better able to speak about their actions, the source of their aggression, and the alternatives which were available. Although they were not yet transferring their knowledge into positive action, a heightened awareness was evident. Finally, staff reports and my own journal led me to a deeper understanding of how perception frames reality. The perception that the program was working led everyone to feel more empowered when a violent incident occurred, and efforts were made to address the cause rather than merely to offer consequences. A general feeling that we were addressing the problem in a productive way was prevalent among the staff and students involved. The findings from this investigation have many implications for research and practice.
Further study into the realm of violence prevention is greatly needed, using a balance of quantitative and qualitative methodologies. Such a serious problem can only be effectively addressed with a greater understanding of its complexities. This study also demonstrates the overall positive impact of the Second Step Violence Prevention Curriculum and, therefore, supports its continued use in our schools.

Relevance:

100.00%

Publisher:

Abstract:

Rocks correlated with the Hough Lake and Quirke Lake Groups of the Huronian Supergroup form part of a northeasterly trending corridor that separates 1750 Ma granitic intrusive rocks of the Chief Lake batholith from the 1850 Ma mafic intrusive rocks of the Sudbury Igneous Complex. This corridor is dissected by two major structural features: the Murray Fault Zone (MFZ) and the Long Lake Fault (LLF). Detailed structural mapping and microstructural analysis indicate that the LLF, which has juxtaposed Huronian rocks of different deformation style and metamorphic grade, was a more significant plane of dislocation than the MFZ. The sense of displacement along the LLF is high-angle reverse, in which rocks to the southeast have been raised relative to those in the northwest. South of the LLF, Huronian rocks underwent ductile deformation under amphibolite-facies conditions. The strain was constrictional, defined by a triaxial strain ellipsoid in which X > Y > Z. Calculations of a regional k value gave approximately 1.3. Penetrative ductile deformation resulted in the development of a preferred crystallographic orientation in quartz as well as the elongation of quartz grains to form a regional southeast-northwest trending, subvertical lineation. Similar lithologies north of the LLF underwent dominantly brittle deformation under greenschist-facies conditions. Deformation north of the LLF is characterized by the thrusting of structural blocks to form angular discordances in bedding orientation which were previously interpreted as folds. Ductile deformation occurred between 1750 and 1238 Ma and is correlated with a regional period of south-over-north reverse faulting that affected much of the southern Sudbury region. Postdating the reverse faulting event was a period of sedimentation, as a conglomerate unit was deposited on vertically bedded Huronian rocks. Rocks in the study area were intruded by both mafic and felsic dykes. The 1238 Ma mafic dykes appear to have been offset during a period of dextral strike-slip displacement along the major faults. Indirect evidence indicates that this event occurred after the thrusting at 950 to 1100 Ma associated with the Grenvillian Orogeny.
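The regional k value of about 1.3 is presumably the Flinn parameter computed from the principal stretches of the strain ellipsoid (k > 1 corresponding to the constrictional field). The abstract does not give the measured axial ratios, so the short sketch below merely illustrates the standard formula with hypothetical numbers.

```python
def flinn_k(x_over_y, y_over_z):
    """Flinn parameter k = (X/Y - 1) / (Y/Z - 1).
    k > 1 indicates constrictional (prolate) strain, k < 1 flattening."""
    return (x_over_y - 1.0) / (y_over_z - 1.0)

# Hypothetical axial ratios chosen so that k is near the reported regional value of ~1.3.
print(flinn_k(x_over_y=2.3, y_over_z=2.0))  # -> 1.3
```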

Relevance:

100.00%

Publisher:

Abstract:

Hydrogen bond assisted alkylation of phenols is compared with the classical base-assisted reactions. The influence of solvents on the fluoride-assisted reactions is discussed, with emphasis on the localization of hydrogen-bond charge density. Polar aprotic solvents such as DMF favour O-alkylation, and nonpolar aprotic solvents such as toluene favour C-alkylation of phenol. For more reactive and soluble fluorides, such as tetrabutylammonium fluoride, the polar aprotic solvent favours O-alkylation and the nonpolar aprotic solvent favours fluorination. Freeze-dried potassium fluoride is a better catalytic agent in hydrogen bond assisted alkylation reactions of phenol than the oven-dried fluoride. The presence of water in the alkylation reactions drastically reduces the expected yield; the tolerance of the reaction to water has also been studied. The use of a phase-transfer catalyst such as tetrabutylammonium bromide in the alkylation reactions of phenol in the presence of potassium fluoride is very effective under anhydrous conditions. Sterically hindered phenols such as 2,6-di-tert-butyl-4-methylphenol could not be alkylated even with the more reactive fluorides, such as tetrabutylammonium fluoride, in either polar or nonpolar aprotic solvents. Attempts were also made to alkylate phenols in the presence of triphenylphosphine oxide.

Relevance:

100.00%

Publisher:

Abstract:

The effect that plants (Typha latifolia) and root-bed medium physical and chemical characteristics have on the treatment of primary treated domestic wastewater within a vertical flow constructed wetland system was investigated. Five sets of cells, with two cells in each set, were used. Each cell was made of concrete, measured 1.0 m × 1.0 m and was 1.3 m deep. Four different root-bed media were tested: Queenston Shale, Fonthill Sand, Niagara Shale and a Michigan Sand. Four of the sets contained plants and a single type of root-bed medium. The influence of plants was tested by operating a Queenston Shale set without plants. Due to budget constraints no replicates were constructed. All of the sets were operated independently and identically for twenty-eight months. Twelve months of data are presented here, collected after 16 months of continuous operation. Root-bed medium type did not influence BOD5 removal. All of the sets consistently met Ontario Ministry of Environment (MOE) requirements (< 25 mg/L) for BOD5 throughout the year. The 12-month average BOD5 concentration from all sets with plants was below 2.36 mg/L. All of the sets were within MOE discharge requirements (< 25 mg/L) for suspended solids, with set effluent concentrations ranging from 1.53 to 14.80 mg/L. The Queenston Shale and Fonthill Sand media removed the most suspended solids, while the Niagara Shale set produced suspended solids. The set containing Fonthill Sand was the only series to meet MOE discharge requirements (< 1 mg/L) for total phosphorus year-round, with a twelve-month mean effluent concentration of 0.23 mg/L. Year-round, all of the root-bed media were well below MOE discharge requirements (< 20 mg/L in winter and < 10 mg/L in summer) for ammonium. The Queenston Shale and Fonthill Sand sets removed the most total nitrogen. Plants had no effect on total nitrogen removal, but did influence how nitrogen was cycled within the system. Plants increased the removal of suspended solids by 14%, BOD5 by 10% and total phosphorus by 22%. Plants also increased the amount of dissolved oxygen that entered the system. During the plant growing season, removal of total phosphorus was better in all sets with plants regardless of media type. The sets containing Queenston Shale and Fonthill Sand media achieved the best results, and plants in the Queenston Shale set increased treatment efficiency for every parameter except nitrogen. Vertical flow wetland sewage treatment systems can be designed and built to consistently meet MOE discharge requirements year-round for BOD5, suspended solids, total phosphorus and ammonium. This system is generally superior to free water systems and sub-surface horizontal flow systems in cold climate situations.

Relevance:

100.00%

Publisher:

Abstract:

The first part of this thesis studied the capacity of amino acids and enzymes to catalyze the hydrolysis and condensation of tetraethoxysilane and phenyltrimethoxysilane. Selected amino acids were shown to accelerate the hydrolysis and condensation of tetraethoxysilane at ambient temperature and pressure and at neutral pH (pH 7 ± 0.02). The nature of the side chain of the amino acid was important in promoting hydrolysis and condensation. Several proteases were shown to have a capacity to hydrolyze tri- and tetra-alkoxysilanes under the same mild reaction conditions. The second part of this thesis employed an immobilized Candida antarctica lipase B (Novozym-435, N435) to produce siloxane-containing polyesters, polyamides, and polyester amides under solvent-free conditions. Enzymatic activity was shown to be temperature-dependent, increasing until enzyme denaturation became the dominant process, which typically occurred between 120 and 130 °C. The residual activity of N435 was, on average, greater than 90% when used in the synthesis of disiloxane-containing polyesters, regardless of the polymerization temperature, except at the very highest temperatures (140–150 °C). A study of the thermal tolerance of N435 determined that, over ten reaction cycles, there was a decrease in the initial rate of polymerization with each consecutive use of the catalyst. No change in the degree of monomer conversion after a 24-hour reaction cycle was found.

Relevance:

100.00%

Publisher:

Abstract:

The preparation and characterization of two families of building blocks for molecule-based magnetic and conducting materials are described in three projects. In the first project, the synthesis and characterization of three bis-imine ligands L1–L3 is reported. Coordination of L1 to a series of metal salts afforded five novel coordination complexes: Sn(L4)Cl4 (I), [Mn(L4)(μ-Cl)(Cl)(EtOH)]2 (II), [Cu(L4)(μ-sal)]2(ClO4)2 (sal = salicylaldehyde anion) (III), [Fe(L5)2]Cl (IV) and [Fe(L1)]2(μ-O) (V). All complexes have been structurally and magnetically characterized. X-ray diffraction studies revealed that, upon coordination to Lewis acidic metal salts, the imine bonds of L1 are susceptible to nucleophilic attack. As a consequence, the coordination complexes (I)–(IV) contain either the cyclised ligand L4 or the hydrolysed ligand L5. In contrast, the dimeric Fe3+ complex (V) comprises two intact L1 ligands. In this complex, the ligand chelates two Fe(III) centres in a bis-bidentate manner through the lone pairs of a phenoxy oxygen and an imine nitrogen atom. Magnetic studies of complexes (II)–(V) indicate that the dominant interactions between neighbouring metal centres in all of the complexes are antiferromagnetic. In the second project, the synthesis and characterization of two families of TTF donors, namely the cyano aryl compounds (VI)–(XI) and the bis-aryl TTF derivatives (XII)–(XIV), are reported. The crystal structures of compounds (VI), (VII), (IX) and (XII) exhibit regular stacks comprising neutral donors. The UV-Vis spectra of compounds (VI)–(XIV) present an ICT band, indicative of the transfer of electron density from the TTF donors to the aryl acceptor molecules. Chemical oxidation of donors (VI), (VII), (IX) and (XII) with iodine afforded a series of CT salts that, where possible, have been characterized by single-crystal X-ray diffraction. Structural studies showed that the radical cations in these salts are organized in stacks comprising dimers of oxidized TTF donors. All four salts behave as semiconductors, displaying room-temperature conductivities ranging from 1.852 × 10⁻⁷ to 9.620 × 10⁻³ S cm⁻¹. A second series of CT salts was successfully prepared by electrocrystallization. Following this methodology, single crystals of two CT salts were obtained. The single-crystal X-ray structures of both salts are isostructural, displaying stacks formed by trimers of oxidized donors. Variable-temperature conductivity measurements carried out on this series of CT salts reveal that they are also semiconductors, with conductivities ranging from 2.94 × 10⁻⁷ to 1.960 × 10⁻³ S cm⁻¹ at room temperature. In the third project, the synthesis and characterization of a series of M(II)(hfac)2 coordination complexes of donor ligand (XII), where M2+ = Co2+, Cu2+, Ni2+ and Zn2+, are reported. These complexes crystallize in a head-to-tail arrangement of TTF donor and bipyridine moieties, placing the metal centres and hfac ligands outside the stacks. Magnetic studies of complexes (XV)–(XVIII) indicate that the bulky hfac ligands prevent neighbouring metal centres from assembling in close proximity, and thus they are magnetically isolated.

Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with assessing the coherence of the conceptual network demonstrated by college-level students enrolled in natural science programs. This coherence was assessed through the analysis of Burt tables derived from answers to multiple-choice questionnaires, through the detailed study of specific discrimination indices (described in more detail in the body of the work), and through the analysis of video recordings of students carrying out an experiment in a real setting. Over the course of this project, four main research questions were explored. 1) What conceptual coherence do students demonstrate in Newtonian physics? 2) Is mastery of uncertainty calculation correlated with the development of logical thinking or with mastery of mathematics? 3) What conceptual coherence do students demonstrate in the quantification of experimental uncertainty? 4) What procedures do students actually put in place to quantify experimental uncertainty in a semi-structured laboratory context? The main conclusions for each of these questions can be summarized as follows. 1) The most widespread misconceptions are not firmly anchored in a rigid conceptual network. For example, a student who answers a question on Newton's third law correctly (the topic with the lowest success rate on the Force Concept Inventory) has only a marginally higher probability than the other participants of answering another question on the same topic correctly. Many pairs of questions show a negative specific discrimination index, indicating weak conceptual coherence on the pre-test and slightly improved conceptual coherence on the post-test. 2) While a small proportion of students showed marked deficiencies on questions related to the control of variables and on those dealing with the relationship between the graphical form of experimental data and a mathematical model, the majority of students can be considered to have an adequate command of both topics. However, almost all students show a lack of mastery of the principles underlying the quantification of experimental uncertainty and the propagation of uncertainties (hereafter referred to as metrology). No statistically significant correlation was observed between these three domains, suggesting that they are largely independent cognitive skills. The Burt table revealed greater conceptual coherence among the control-of-variables questions than the matrix of Pearson correlation coefficients would have suggested. In metrology, equivalent questions did not reveal clearly demonstrated conceptual coherence. 3) The analysis of a questionnaire devoted entirely to metrology points to misconceptions arising from learning in previous courses (didactic obstacles), misconceptions based on intuitive models, and an absence of overall understanding of metrological concepts, although some concepts appear to be in the process of being acquired. 4) When students are left to their own devices, the same difficulties identified by the questionnaire analysis in point 3) reappear, corroborating those results. However, we were also able to observe other behaviours related to laboratory measurement that could not have been assessed by a multiple-choice questionnaire.
Explicitation interviews held immediately after each session allowed the participants to detail certain aspects of their metrological methodology, in particular their use of repeated experimental measurements, their strategies for quantifying uncertainty, and the reasons underlying their numerical estimates of reading uncertainties. The use of uncertainty propagation algorithms was adequate overall. Many misconceptions in metrology seem to strongly resist instruction, notably assigning the resolution of a digital measuring instrument as the value of the uncertainty, and the absence of stacking procedures to reduce uncertainty. The conception that the precision of a numerical value cannot be smaller than the tolerance of an instrument appears to be firmly entrenched.
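For context on the uncertainty-propagation algorithms mentioned above, the sketch below shows the standard first-order propagation formula for a quotient of independent quantities; the quantities and values are hypothetical and are not taken from the thesis.

```python
import math

def propagate_quotient(x, dx, y, dy):
    """First-order propagation for q = x / y with independent uncertainties:
    dq / |q| = sqrt((dx/x)**2 + (dy/y)**2)."""
    q = x / y
    dq = abs(q) * math.sqrt((dx / x) ** 2 + (dy / y) ** 2)
    return q, dq

# Hypothetical example: a speed obtained from a distance and a time measurement.
v, dv = propagate_quotient(x=1.250, dx=0.001, y=0.56, dy=0.01)  # metres, seconds
print(f"v = {v:.3f} ± {dv:.3f} m/s")
```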

Relevance:

100.00%

Publisher:

Abstract:

My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first chapter, we develop a computationally efficient state-smoothing procedure for linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyse the computational efficiency of Kalman-filter-based methods, the Cholesky factor algorithm, and our new method, using operation counts and computational experiments. We show that for many important cases our method is more efficient. The gains are especially large when the dimension of the observed variables is large or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, which is used to analyse transaction count data in financial markets. In the second chapter, we propose a new technique for analysing multivariate stochastic volatility models. The proposed method is based on drawing the volatility efficiently from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the return equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student-t marginal distributions with specific degrees of freedom to capture the heterogeneity of the returns. We draw the volatility as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of the volatility of one return given the volatilities of the other returns, the parameters, and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and two multivariate models. In the third chapter, we evaluate the information contributed by realized volatility measures to volatility estimation and forecasting when prices are measured with and without error. We use stochastic volatility models. We take the point of view of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity that contains information about it. We use Bayesian Markov chain Monte Carlo methods to estimate the models, which allow us to obtain not only the posterior densities of volatility but also the predictive densities of future volatility. We compare volatility forecasts, and their hit rates, with and without the information contained in realized volatility.
This approach differs from existing ones in the empirical literature, which are most often limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns on stock indices and exchange rates. The competing models are applied to the second half of 2008, a salient period of the recent financial crisis.
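The first chapter compares Kalman-filter-based smoothing against other ways of drawing the latent states; the thesis's own algorithm is not reproduced here, but a minimal Kalman filter with Rauch-Tung-Striebel smoothing for a univariate local-level model (an assumed, illustrative special case) gives a sense of the kind of baseline being compared. All names and values below are illustrative assumptions.

```python
import numpy as np

def kalman_rts_smoother(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter and Rauch-Tung-Striebel smoother for the local-level model
    y_t = alpha_t + eps_t,  alpha_{t+1} = alpha_t + eta_t  (Gaussian disturbances)."""
    n = len(y)
    a_filt = np.empty(n)   # filtered state means  E[alpha_t | y_1..y_t]
    p_filt = np.empty(n)   # filtered state variances
    a, p = a0, p0
    for t in range(n):
        k = p / (p + sigma_eps2)            # Kalman gain
        a_filt[t] = a + k * (y[t] - a)
        p_filt[t] = (1.0 - k) * p
        a = a_filt[t]                       # one-step-ahead prediction (transition F = 1)
        p = p_filt[t] + sigma_eta2
    a_smooth = a_filt.copy()                # smoothed means E[alpha_t | y_1..y_n]
    for t in range(n - 2, -1, -1):
        j = p_filt[t] / (p_filt[t] + sigma_eta2)   # smoother gain
        a_smooth[t] += j * (a_smooth[t + 1] - a_filt[t])
    return a_smooth

# Illustrative run on simulated data (parameter values are arbitrary).
rng = np.random.default_rng(0)
alpha = np.cumsum(rng.normal(scale=0.1, size=200))
y = alpha + rng.normal(scale=0.5, size=200)
print(kalman_rts_smoother(y, sigma_eps2=0.25, sigma_eta2=0.01)[:5])
```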

Relevance:

100.00%

Publisher:

Abstract:

Family life with an adolescent brings its share of challenges. An adolescent's emotions, which sometimes resemble a roller coaster, can strain relationships within the family unit and even beyond it. Because of its unexpected nature, the occurrence of a traumatic brain injury (TBI) in an adolescent further weakens the family dynamic. Moreover, the myriad impacts of the TBI force the family to modify its life plan and to work together to rebuild it. Resilience in the face of trauma does not manifest itself in the same way in all families confronted with it. Some succeed in transforming themselves positively, while others do not, or experience greater difficulty. It is therefore appropriate to implement family-centred, interdisciplinary approaches to care that would foster recognition of the elements that can support the family's resilience process through this ordeal and, ultimately, help transform its life plan. Using the humanistic model of nursing care as its disciplinary perspective (Cara, 2012; Cara & Girard, 2013; Girard & Cara, 2011), this qualitative, inductive study (LoBiondo-Wood, Haber, Cameron, & Singh, 2009), supported by a collaborative research approach (Desgagné, 1997), enabled the co-construction of the components of an intervention program supporting family resilience, together with families of adolescents with moderate or severe TBI and with rehabilitation professionals. The model for developing and validating complex interventions (Van Meijel, Gamel, Van Swieten-Duijfjes, & Grypdonck, 2004) structured the data collection into three phases. The first phase consisted of identifying the components of the intervention program according to the families (n=6) and the rehabilitation professionals (n=5). The prioritization and validation of the program components, the second and third phases respectively, were carried out with these same families (n=6 in phase 2 and n=4 in phase 3) and rehabilitation professionals (n=5 in phases 2 and 3). The data analysis process (Miles & Huberman, 2003) identified five integrative themes, considered to be the components of the intervention program supporting family resilience following an adolescent's moderate or severe TBI: 1) the family's characteristics and influences; 2) positive family strategies; 3) family and social support; 4) management of the occupational dimension; and 5) the contribution of the community and health professionals. The results of this co-construction process produced a solid matrix, flexible enough to adapt to the different contexts in which families and rehabilitation professionals operate. The study also offers promising avenues for practitioners, managers, and researchers in nursing and other disciplines regarding the implementation of concrete strategies to support the resilience process of families facing particularly difficult life situations.

Relevance:

100.00%

Publisher:

Abstract:

As shown by different scholars, the idea of “author” is not absolute or necessary. On the contrary, it came to life as an answer to the very practical needs of an emerging print technology in search of an economic model of its own. In this context, and according to the criticism of the notion of “author” made during the 1960–70s (in particular by Barthes and Foucault), it would only be natural to consider the idea of the author being dead as a global claim accepted by all scholars. Yet this is not the case, because, as Rose suggests, the idea of “author” and the derived notion of copyright are still too important in our culture to be abandoned. But why such an attachment to the idea of “author”? The hypothesis on which this chapter is based is that the theory of the death of the author—developed in texts such as What is an Author? by Michel Foucault and The Death of the Author by Roland Barthes—did not provide the conditions for a shift towards a world without authors because of its inherent lack of concrete editorial practices different from the existing ones. In recent years, the birth and diffusion of the Web have allowed the concrete development of a different way of interpreting the authorial function, thanks to new editorial practices—which will be named “editorialization devices” in this chapter. Thus, what was inconceivable for Rose in 1993 is possible today because of the emergence of digital technology—and in particular, the Web.

Relevance:

100.00%

Publisher:

Abstract:

In India, industrial pollution has become a subject of increasing concern, and incidents of industrial pollution have been reported from many parts of the country. Cochin, the collection site of the present study, is the industrial capital of Kerala and also a harbour, and is therefore vulnerable to pollution by trace metal contaminants. In recent times, the pollutants of greatest concern in the aquatic environment are the persistent ones, such as toxic heavy metals and the chlorinated hydrocarbons, which include insecticides and pesticides. The animals collected from the clam bed situated on the northern side of the Cochin barmouth are subject to wide fluctuations in salinity, both seasonal and tidal. Salinity is also considered an important parameter influencing the physiological functioning of an organism; hence, the salinity tolerance of the animal was determined. Considering the potential vulnerability of the Cochin backwaters to heavy metal pollution, a study of the impact of the heavy metal copper (II) on the bivalve Sunetta scripta was conceived. Static bioassays were conducted to determine sublethal concentrations of the metal as a preliminary step towards the toxicity studies. Oxygen consumption and filtration rate, which are considered reliable sublethal toxicity indices, were employed to investigate the toxic effects of the metal. Bioaccumulation, a physiological phenomenon that is important from the public health point of view and in the assessment of environmental quality, is also dealt with.

Relevance:

100.00%

Publisher:

Abstract:

Very little is known about the tolerance of penaeid prawns in Indian waters under varying environmental conditions. Apart from a note on the effect of salinity on the growth of juvenile Penaeus indicus by Sreekumaran Nair and Krishnankutty, there appears to be no work on this aspect; moreover, the oxygen consumption of Metapenaeus dobsoni, a major constituent of the prawn fishery in this region, has not been studied so far. The present work comprises studies on the occurrence and abundance of penaeid prawns in two major estuaries of Kerala, the Kayamkulam Lake and the Cochin backwaters; on the salinity and temperature tolerance of, and the effect of salinity on the growth of, three commercially important prawns of Kerala, namely Penaeus indicus, Metapenaeus dobsoni and M. monoceros; and on the respiratory metabolism of M. dobsoni.