970 results for Center of pressure


Relevance: 90.00%

Abstract:

The combined effect of pressure and mild temperature treatments on bovine sarcoplasmic proteins and quality parameters was assessed. M. longissimus dorsi samples were pressurised in a range of 200–600 MPa and 10–30 °C. High Pressure Processing (HPP) induced a reduction of protein solubility (p < 0.001) compared to non-treated controls (NT), more pronounced above 200 MPa. HPP at pressures higher than 200 MPa induced a strong modification (p < 0.001) of meat colour and a reduction of water-holding capacity (WHC). SDS–PAGE analysis demonstrated that HPP significantly modified the composition of the sarcoplasmic protein fraction. The pressurisation temperature mainly affected protein solubility and colour; a smaller effect was observed on protein profiles. Significant correlations (p < 0.001) between sarcoplasmic protein solubility and both expressible moisture (r = −0.78) and colour parameters (r = −0.81 to −0.91) suggest that pressure-induced denaturation of sarcoplasmic proteins could, to some extent, influence the WHC and colour modifications of beef. Changes in protein band intensities were also significantly correlated with protein solubility, meat lightness and expressible moisture. These results describe the changes induced by HPP on sarcoplasmic proteins and confirm a relationship between modification of the sarcoplasmic protein fraction and alteration of meat quality characteristics.
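The correlation coefficients quoted above (e.g. r = −0.78 between sarcoplasmic protein solubility and expressible moisture) are ordinary Pearson coefficients computed over paired sample measurements. A minimal sketch of how such an r value and its p-value are obtained, using invented numbers rather than the study's data:

from scipy.stats import pearsonr

# Invented paired measurements for six samples: sarcoplasmic protein
# solubility (mg/g) and expressible moisture (%). A negative r is expected,
# since lower solubility after HPP goes with higher expressible moisture.
solubility = [52.0, 48.5, 40.2, 35.1, 30.8, 27.4]
expressible_moisture = [18.2, 19.0, 22.5, 24.1, 26.3, 27.0]

r, p_value = pearsonr(solubility, expressible_moisture)
print(round(r, 3), round(p_value, 5))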

Relevance: 90.00%

Abstract:

Garlic viruses often occur in mixed infections under field conditions. In this study, garlic samples collected in three geographical areas of Brazil were tested by Dot-ELISA for the detection of allexiviruses, using specific monoclonal antibodies against Garlic virus A (GarV-A), Garlic virus B (GarV-B) and Garlic virus C (GarV-C), and a polyclonal antiserum able to detect the three virus species mentioned plus Garlic virus D (GarV-D). The detected viruses were biologically isolated by successive passages through Chenopodium quinoa. Reverse Transcriptase Polymerase Chain Reaction (RT-PCR) was performed using primers designed from specific regions of the coat protein genes of Japanese allexiviruses available in the GenBank database of the National Center for Biotechnology Information (NCBI). By these procedures, individual garlic virus genomes were isolated and sequenced. The nucleotide and amino acid sequence analyses, together with the serological data, revealed the presence in Brazil of three distinct allexiviruses: GarV-C, GarV-D and a recently described allexivirus named Garlic mite-borne filamentous virus (GarMbFV).

Relevance: 90.00%

Abstract:

In the n-body problem a central configuration is formed when the position vector of each particle with respect to the center of mass is a common scalar multiple of its acceleration vector. Lindstrom showed for n = 3 and for n > 4 that if n − 1 masses are located at fixed points in the plane, then there are only a finite number of ways to position the remaining nth mass in such a way that they define a central configuration. Lindstrom leaves open the case n = 4. In this paper we prove the case n = 4 using as variables the mutual distances between the particles.
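For reference, the defining condition paraphrased above can be written explicitly (a standard formulation with the gravitational constant set to 1, not copied from the paper): with masses m_i at positions q_i and center of mass c, a central configuration satisfies

\[
\ddot{q}_i \;=\; \sum_{j \neq i} \frac{m_j\,(q_j - q_i)}{\lVert q_j - q_i \rVert^{3}}
\;=\; \lambda\,(q_i - c),
\qquad
c = \frac{\sum_{k=1}^{n} m_k q_k}{\sum_{k=1}^{n} m_k},
\qquad i = 1,\dots,n,
\]

for a common scalar λ, i.e. each acceleration vector is the same multiple of the corresponding position vector relative to the center of mass.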

Relevance: 90.00%

Abstract:

The influence of second phases (e.g., pyroxenes) on olivine grain size was studied by quantitative microfabric analyses of samples of the Hilti massif mantle shear zone (Semail ophiolite, Oman). The microstructures range from porphyroclastic tectonites to ultramylonites, from outside to the center of the shear zone. Starting at conditions of ridge-related flow, they formed under continuous cooling leading to progressive strain localization. The dependence of the average olivine grain size on the second-phase content can be split into a second-phase-controlled and a dynamic recrystallization-controlled field. In the former, the olivine grain size is related to the ratio between the second-phase grain size and volume fraction (Zener parameter). In the latter, dynamic recrystallization manifested by a balance between grain growth and grain size reduction processes yields a stable olivine grain size. In both fields the average olivine and second-phase grain size decreases with decreasing temperature. Combining the microstructural information with deformation mechanism maps suggests that the porphyroclastic tectonites (~1100 °C) and mylonites (~800 °C) formed under the predominance of dislocation creep. Since olivine-rich layers are intercalated with layer-parallel polymineralic bands in the mylonites, nearly equiviscous conditions can be assumed. In the ultramylonites, diffusion creep represents the major deformation mechanism in the polymineralic layers. It is this switch in deformation mechanism from dislocation creep to diffusion creep that forces strain to localize in the fine-grained polymineralic domains at low temperatures (< ~700 °C), underlining the role of the second phases on strain localization in cooling mantle rocks.
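For reference, the Zener parameter mentioned above is the ratio of second-phase grain size to second-phase volume fraction, and in the second-phase-controlled field the matrix (olivine) grain size is commonly written as a power law of it; the prefactor k and exponent m below are empirical constants introduced for illustration, not values reported in this abstract:

\[
Z = \frac{d_p}{f_p},
\qquad
D_{\mathrm{ol}} = k\,Z^{\,m} = k\left(\frac{d_p}{f_p}\right)^{m},
\]

where D_ol is the average olivine grain size, d_p the average second-phase (e.g. pyroxene) grain size and f_p its volume fraction.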

Relevance: 90.00%

Abstract:

The purpose of this investigation was to study the flexural fatigue strength of two prestressed steel I-beams which had previously been fabricated in connection with a jointly sponsored project under the auspices of the Iowa State Highway Commission. The beams were prestressed by deflecting them under the action of a concentrated load at the center of a simple span, then welding unstressed high-strength steel plates to the top and bottom flanges to retain a predetermined amount of prestress. The beams were rolled sections of A36 steel and the plates were USS "T-1" steel. Each of the two test specimens was subjected to an identical repeated loading until a fatigue failure occurred. The loading was designed to produce stresses equivalent to those which would have occurred in a simulated bridge and amounted to 84 percent of a standard H-15 live load including impact. One of the beams sustained 2,469,100 repetitions of load to failure and the other sustained 2,756,100 cycles. Following the fatigue tests, an experimental study was made to determine the state of stress that had been retained in the prestressed steel beams. This information, upon which the calculated stresses of the test could be superimposed, provided a method of correlating the fatigue strength of the beams with the fatigue information available on the two steels involved.

Relevance: 90.00%

Abstract:

Arterial hypertension (HTA) is one of the most important public health problems because of its high prevalence, its complications, its high mortality and morbidity, and the cost of its control and treatment. It is a major risk factor for cardiovascular and cerebrovascular disease, since it favours the formation of atherosclerotic plaques. Hypertension is present in both sexes and at any age, reducing life expectancy. Smoking, arterial hypertension, cholesterol levels, obesity and physical inactivity, stress, and alcohol and salt consumption are considered modifiable risk factors. The control of arterial hypertension, together with the other risk factors that cause cardiovascular alterations, is probably one of the greatest public health problems in the world. The aim of this work is to raise awareness among the hypertensive patients who attend the Fraga health centre of the importance of adopting healthy lifestyle habits and avoiding risk factors that worsen their disease, through the creation of a health education programme.

Relevance: 90.00%

Abstract:

During the Pleistocene glaciations, the Alps were an efficient barrier to gene flow between isolated populations, often leading to allopatric speciation. Afterwards, the Alps strongly influenced the post-glacial recolonization of Europe and represent a major suture zone between differentiated populations. Two hybrid zones in the Swiss and French Alps between two genetically and chromosomally well-differentiated species, the Valais shrew, Sorex antinorii, and the common shrew, S. araneus, were studied karyotypically and by analyzing the distribution of seven microsatellite loci. In the center of the Haslital hybrid zone the two species coexist over a distance of 900 m. Hybrid karyotypes, among them the most complex known in Sorex, are rare. F-statistics based on microsatellite data revealed a strong heterozygote deficit only in the center of the zone, due to the sympatric distribution of the two species with little hybridization between them. Structuring within the species (both F(IS) and F(ST)) was low. A hierarchical analysis showed a high level of interspecific differentiation. Results were compared with those previously reported in another hybrid zone located at Les Houches in the French Alps. Genetic structuring within and between species was comparable in both hybrid zones, although chromosomal incompatibilities are more important in Haslital, where a linkage block of the race-specific chromosomes should additionally impede gene flow. Evidence for a more restricted gene flow in Haslital comes from the genetically intermediate hybrid karyotypes, whereas in Les Houches, hybrid karyotypes are genetically identical to individuals of the pure karyotypic races. Genic and chromosomal introgression was observed in Les Houches, but not in Haslital. The possible influence of a river, separating the two species at Les Houches, on gene flow is discussed.
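As a rough illustration of the F-statistics behind the reported heterozygote deficit (not the authors' actual analysis pipeline), the single-locus inbreeding coefficient F_IS can be estimated as 1 − H_obs/H_exp from observed and Hardy-Weinberg expected heterozygosity; the genotypes below are invented for the example:

from collections import Counter

def f_is(genotypes):
    """Estimate F_IS = 1 - H_obs / H_exp for one microsatellite locus.
    genotypes: list of (allele_a, allele_b) tuples, one per individual."""
    n = len(genotypes)
    # Observed heterozygosity: fraction of heterozygous individuals.
    h_obs = sum(a != b for a, b in genotypes) / n
    # Allele frequencies over the 2n gene copies.
    counts = Counter(a for g in genotypes for a in g)
    freqs = [c / (2 * n) for c in counts.values()]
    # Expected heterozygosity under Hardy-Weinberg equilibrium.
    h_exp = 1.0 - sum(p * p for p in freqs)
    return 1.0 - h_obs / h_exp

# Invented genotypes at one locus: a deficit of heterozygotes gives a positive
# F_IS, as expected where two species co-occur with little hybridization.
sample = [(151, 151), (151, 151), (155, 155), (155, 155),
          (151, 155), (151, 151), (155, 155), (151, 155)]
print(round(f_is(sample), 3))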

Relevance: 90.00%

Abstract:

The pharmaceutical industry has been facing several challenges during the last years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques do participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with a high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented, EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of the diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein–ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and conversely to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase, and in the activation of the nuclear hormone receptor peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
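As an illustration of the 2 Å success criterion used in this validation (a generic sketch, not EADock's internal implementation), the RMSD between a predicted ligand pose and the crystallographic pose, assuming identical atom ordering and poses already expressed in the receptor frame, can be computed as:

import numpy as np

def ligand_rmsd(pred_coords, xtal_coords):
    """RMSD (same units as the input, e.g. Angstrom) between two conformations
    of the same ligand, without any additional superposition."""
    pred = np.asarray(pred_coords, dtype=float)
    xtal = np.asarray(xtal_coords, dtype=float)
    return float(np.sqrt(np.mean(np.sum((pred - xtal) ** 2, axis=1))))

# Placeholder 3-atom example; a docked pose with RMSD below 2.0 Angstrom from
# the crystal pose counts as a correct binding mode in the benchmark above.
pose = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.2, 0.0]]
xtal = [[0.3, 0.1, 0.0], [1.6, 0.2, 0.1], [1.4, 1.0, 0.2]]
print(ligand_rmsd(pose, xtal) < 2.0)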

Relevance: 90.00%

Abstract:

Working in an NGO often involves providing life-saving resources (food, medicine, equipment, water, etc.) to needy populations around the globe. Such duty requires highly dedicated employees, and humanitarian workers are said to face a high degree of pressure in their daily work. Despite the evidence of taxing work demands and a high potential for stress-related problems, very few studies on occupational chronic stress have specifically looked at NGO workers. Assuming that "field stress" can be relayed to workers at headquarters, we carried out an exploratory study about occupational health among employees of an NGO's headquarters. We sent a questionnaire to all employees (N=130) of an NGO headquarters located in Switzerland. We used the TST questionnaire (French version of Langner's questionnaire on psychiatric symptoms) to identify cases with potential mental health problems. We also included in the questionnaire some items about motivation, acknowledgment, work-life balance, job demand, and autonomy. A total of 75 employees answered our questionnaire (57% response rate). 44% of our sample were men (n=33) and 56% were women (n=42). The mean age was 40 years (SD=7.6). 56% had been working at the headquarters of the NGO in question for 2 years or less. Not surprisingly, a majority of respondents reported being highly motivated (74%) and the meaning of work was important for 80% of them. However, 35% indicated having problems reconciling their private and professional life. The most frequently reported symptoms included feeling "weak all over" (81%), having "trouble getting asleep often" (35%), "clogging in nose" (35%), feeling "nervous often" (33%), and "memory not all right" (33%). The score for psychiatric symptoms was high in 8 (11%) employees, whose health might therefore be at risk. In comparison, other studies showed that this proportion was 9% for French teachers and 16% for sales personnel. Results show that symptoms of mental health problems do occur among NGO workers. Some of these symptoms are known to be linked to occupational stress. Chronic stress manifests itself first in non-specific symptoms (e.g. fatigue) and later in specific pathologies. This could explain why the proportion of cases with a high score on Langner's scale was lower than expected. Therefore, we hypothesize a healthy worker effect. The fact that our sample is 40 years old on average, and that the turnover is quite high, also supports this hypothesis. Further research is needed in order to better understand occupational stress in this specific population. An upcoming study will investigate the role of organizational factors associated with health complaints. To this end, a longitudinal survey including quantitative and qualitative methods is appropriate.

Relevance: 90.00%

Abstract:

We investigate the spatial dependence of the exciton lifetimes in single ZnO nanowires. We have found that the free-exciton and bound-exciton lifetimes exhibit a maximum at the center of the nanowires, while they decrease by 30% towards the tips. This dependence is explained by considering the cavity-like properties of the nanowires in combination with the Purcell effect. We show that the lifetime of the bound excitons scales with the localization energy to the power of 3/2, which validates the model of Rashba and Gurgenishvili at the nanoscale.
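The reported scaling can be stated compactly (this restates the relation given in the abstract; the proportionality constant is not specified there):

\[
\tau_{\mathrm{BX}} \;\propto\; E_{\mathrm{loc}}^{3/2},
\]

where τ_BX is the bound-exciton lifetime and E_loc its localization energy, consistent with the Rashba-Gurgenishvili giant-oscillator-strength picture in which the oscillator strength scales as E_loc^(-3/2).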

Relevance: 90.00%

Abstract:

The goal of this study was to investigate the impact of computing parameters and the location of volumes of interest (VOI) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement (the sampling distances b_x,y,z, the VOI lengths L_x,y,z, the number of VOIs N_VOI and the structured noise) was investigated to minimize measurement errors. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (L_x,y,z = 64, b_x,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. The NPS estimated from off-centered small VOIs had a directional dependency, contrary to the NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This showed that the VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
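A minimal sketch of a 3D NPS estimate along the lines described above (first-order detrending of each VOI, 3D Fourier transform, ensemble average of the squared magnitude, normalization by voxel size and VOI extent). The plane-fit detrending and the 0.625 mm slice spacing are simplifying assumptions for illustration, not the authors' exact implementation:

import numpy as np

def nps_3d(vois, bx, by, bz):
    """Estimate the 3D noise power spectrum (HU^2 * mm^3) from noise-only VOIs.
    vois     : list of 3D arrays of identical shape (Lz, Ly, Lx), in HU
    bx,by,bz : sampling distances (voxel sizes) along x, y, z in mm"""
    lz, ly, lx = vois[0].shape
    acc = np.zeros((lz, ly, lx))
    zz, yy, xx = np.meshgrid(np.arange(lz), np.arange(ly), np.arange(lx), indexing="ij")
    A = np.column_stack([np.ones(lz * ly * lx), xx.ravel(), yy.ravel(), zz.ravel()])
    for voi in vois:
        # First-order detrending: subtract a least-squares plane a + b*x + c*y + d*z
        # to suppress low-frequency structured noise.
        coef, *_ = np.linalg.lstsq(A, voi.ravel(), rcond=None)
        detrended = voi - (A @ coef).reshape(voi.shape)
        acc += np.abs(np.fft.fftn(detrended)) ** 2
    # (bx*by*bz)/(Lx*Ly*Lz) turns the averaged squared DFT into a spectral density.
    return (acc / len(vois)) * (bx * by * bz) / (lx * ly * lz)

# Synthetic example: 10 white-noise VOIs (sigma = 10 HU), Lx = Ly = Lz = 64 and
# 0.391 mm in-plane sampling as in the abstract; the 0.625 mm slice spacing
# (64 x 0.625 mm = 40 mm) is an assumption.
rng = np.random.default_rng(0)
vois = [rng.normal(0.0, 10.0, size=(64, 64, 64)) for _ in range(10)]
nps = nps_3d(vois, bx=0.391, by=0.391, bz=0.625)
print(nps.shape, float(nps.mean()))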

Relevance: 90.00%

Abstract:

Velocity-density tests conducted in the laboratory involved small 4-inch-diameter by 4.58-inch-long compacted soil cylinders made up of 3 differing soil types and with varying degrees of density and moisture content, the latter being varied well beyond optimum moisture values. Seventeen specimens were tested, 9 with velocity determinations made along two elements of the cylinder, 180 degrees apart, and 8 along three elements, 120 degrees apart. Seismic energy was developed by blows of a small tack hammer on a 5/8-inch-diameter steel ball placed at the center of the top of the cylinder, with the detector placed successively at four points spaced 1/2 inch apart on the side of the specimen, giving wave travel paths varying from 3.36 inches to 4.66 inches in length. Time intervals were measured using a model 217 micro-seismic timer in both laboratory and field measurements. Forty blows of the hammer were required for each velocity determination, which amounted to 80 blows on 9 laboratory specimens and 120 blows on the remaining 8 cylinders. Thirty-five field tests were made over the three selected soil types, all fine-grained, using a 2-foot seismic line with hammer-impact points at 6-inch intervals. The small tack hammer and 5/8-inch steel ball were, again, used to develop seismic wave energy. Generally, the densities obtained from the velocity measurements were lower than those measured in the conventional field testing. Conclusions were reached that: (1) the method does not appear to be usable for measurement of density of essentially fine-grained soils when the moisture content greatly exceeds the optimum for compaction, and (2) due to a gradual reduction in velocity upon aging, apparently because of gradual absorption of pore water into the expandable interlayer region of the clay, the seismic test should be conducted immediately after soil compaction to obtain a meaningful velocity value.

Relevance: 90.00%

Abstract:

Summary: This dissertation explores how stakeholder dialogue influences corporate processes, and speculates about the potential of this phenomenon - particularly with actors, like non-governmental organizations (NGOs) and other representatives of civil society, which have received growing attention against a backdrop of increasing globalisation and which have often been cast in an adversarial light by firms - as a source of learning and a spark for innovation in the firm. The study is set within the context of the introduction of genetically-modified organisms (GMOs) in Europe. Its significance lies in the fact that scientific developments and new technologies are being generated at an unprecedented rate in an era where civil society is becoming more informed, more reflexive, and more active in facilitating or blocking such new developments, which could have the potential to trigger widespread changes in economies, attitudes, and lifestyles, and address global problems like poverty, hunger, climate change, and environmental degradation. In the 1990s, companies using biotechnology to develop and offer novel products began to experience increasing pressure from civil society to disclose information about the risks associated with the use of biotechnology and GMOs, in particular. Although no harmful effects for humans or the environment have been factually demonstrated even to date (2008), this technology remains highly contested and its introduction in Europe catalysed major companies to invest significant financial and human resources in stakeholder dialogue. A relatively new phenomenon at the time, with little theoretical backing, dialogue was seen to reflect a move towards greater engagement with stakeholders, commonly defined as those "individuals or groups with which business interacts who have a 'stake', or vested interest in the firm" (Carroll, 1993:22), with whom firms are seen to be inextricably embedded (Andriof & Waddock, 2002). Regarding the organisation of this dissertation, Chapter 1 (Introduction) describes the context of the study and elaborates its significance for academics and business practitioners as an empirical work embedded in a sector at the heart of the debate on corporate social responsibility (CSR). Chapter 2 (Literature Review) traces the roots and evolution of CSR, drawing on Stakeholder Theory, Institutional Theory, Resource Dependence Theory, and Organisational Learning to establish what has already been developed in the literature regarding the stakeholder concept, motivations for engagement with stakeholders, the corporate response to external constituencies, and outcomes for the firm in terms of organisational learning and change. I used this review of the literature to guide my inquiry and to develop the key constructs through which I viewed the empirical data that was gathered. In this respect, concepts related to how the firm views itself (as a victim, follower, leader), how stakeholders are viewed (as a source of pressure and/or threat; as an asset: current and future), corporate responses (in the form of buffering, bridging, boundary redefinition), and types of organisational learning (single-loop, double-loop, triple-loop) and change (first order, second order, third order) were particularly important in building the key constructs of the conceptual model that emerged from the analysis of the data.
Chapter 3 (Methodology) describes the methodology that was used to conduct the study, affirms the appropriateness of the case study method in addressing the research question, and describes the procedures for collecting and analysing the data. Data collection took place in two phases, extending from August 1999 to October 2000 and from May to December 2001, which functioned as 'snapshots' in time of the three companies under study. The data was systematically analysed and coded using ATLAS/ti, a qualitative data analysis tool, which enabled me to sort, organise, and reduce the data into a manageable form. Chapter 4 (Data Analysis) contains the three cases that were developed (anonymised as Pioneer, Helvetica, and Viking). Each case is presented in its entirety (constituting a 'within-case' analysis), followed by a 'cross-case' analysis, backed up by extensive verbatim evidence. Chapter 5 presents the research findings, outlines the study's limitations, describes managerial implications, and offers suggestions for where more research could elaborate the conceptual model developed through this study, as well as suggestions for additional research in areas where managerial implications were outlined. References and Appendices are included at the end. This dissertation results in the construction and description of a conceptual model, grounded in the empirical data and tied to existing literature, which portrays a set of elements and relationships deemed important for understanding the impact of stakeholder engagement for firms in terms of organisational learning and change. This model suggests that corporate perceptions about the nature of stakeholders influence the perceived value of stakeholder contributions. When stakeholders are primarily viewed as a source of pressure or threat, firms tend to adopt a reactive/defensive posture in an effort to manage stakeholders and protect the firm from sources of outside pressure - behaviour consistent with Resource Dependence Theory, which suggests that firms try to get control over external threats by focussing on the relevant stakeholders on whom they depend for critical resources, and try to reverse the control potentially exerted by external constituencies by trying to influence and manipulate these valuable stakeholders. In situations where stakeholders are viewed as a current strategic asset, firms tend to adopt a proactive/offensive posture in an effort to tap stakeholder contributions and connect the organisation to its environment - behaviour consistent with Institutional Theory, which suggests that firms try to ensure the continuing license to operate by internalising external expectations. In instances where stakeholders are viewed as a source of future value, firms tend to adopt an interactive/innovative posture in an effort to reduce or widen the embedded system and bring stakeholders into systems of innovation and feedback - behaviour consistent with the literature on Organisational Learning, which suggests that firms can learn how to optimize their performance as they develop systems and structures that are more adaptable and responsive to change. The conceptual model moreover suggests that the perceived value of stakeholder contribution drives corporate aims for engagement, which can be usefully categorised as dialogue intentions spanning a continuum running from low-level to high-level to very-high-level.
This study suggests that activities aimed at disarming critical stakeholders ('manipulation'), providing guidance and correcting misinformation ('education'), being transparent about corporate activities and policies ('information'), alleviating stakeholder concerns ('placation'), and accessing stakeholder opinion ('consultation') represent low-level dialogue intentions and are experienced by stakeholders as asymmetrical, persuasive, compliance-gaining activities that are not in line with 'true' dialogue. This study also finds evidence that activities aimed at redistributing power ('partnership'), involving stakeholders in internal corporate processes ('participation'), and demonstrating corporate responsibility ('stewardship') reflect high-level dialogue intentions. This study additionally finds evidence that building and sustaining high-quality, trusted relationships which can meaningfully influence organisational policies incline a firm towards the type of interactive, proactive processes that underpin the development of sustainable corporate strategies. Dialogue intentions are related to the type of corporate response: low-level intentions can lead to buffering strategies; high-level intentions can underpin bridging strategies; very-high-level intentions can incline a firm towards boundary redefinition. The nature of the corporate response (which encapsulates a firm's posture towards stakeholders, demonstrated by the level of dialogue intention and the firm's strategy for dealing with stakeholders) favours the type of learning and change experienced by the organisation. This study indicates that buffering strategies, where the firm attempts to protect itself against external influences and carry out its existing strategy, typically lead to single-loop learning, whereby the firm learns how to perform better within its existing paradigm and, at most, improves the performance of the established system - an outcome associated with first-order change. Bridging responses, where the firm adapts organisational activities to meet external expectations, typically lead a firm to acquire new behavioural capacities characteristic of double-loop learning, whereby insights and understanding are uncovered that are fundamentally different from existing knowledge and where stakeholders are brought into problem-solving conversations that enable them to influence corporate decision-making to address shortcomings in the system - an outcome associated with second-order change. Boundary redefinition suggests that the firm engages in triple-loop learning, where the firm changes relations with stakeholders in profound ways, considers problems from a whole-system perspective, examining the deep structures that sustain the system, producing innovation to address chronic problems and develop new opportunities - an outcome associated with third-order change. This study supports earlier theoretical and empirical studies (e.g. Weick's (1979, 1985) work on self-enactment; Maitlis & Lawrence's (2007), Maitlis' (2005) and Weick et al.'s (2005) work on sensegiving and sensemaking in organisations; Brickson's (2005, 2007) and Scott & Lane's (2000) work on organisational identity orientation), which indicate that corporate self-perception is a key underlying factor driving the dynamics of organisational learning and change.
Such theorizing has important implications for managerial practice; namely, that a company which perceives itself as a 'victim' may be highly inclined to view stakeholders as a source of negative influence, and would therefore be potentially unable to benefit from the positive influence of engagement. Such a self-perception can prevent the firm from seeing stakeholders in a more positive, contributing light, which suggests that such firms may not be inclined to embrace external sources of innovation and learning, as they are focussed on protecting the firm against disturbing environmental influences (through buffering), and remain more likely to perform better within an existing paradigm (single-loop learning). By contrast, a company that perceives itself as a 'leader' may be highly inclined to view stakeholders as a source of positive influence. On the downside, such a firm might have difficulty distinguishing when stakeholder contributions are less pertinent, as it is deliberately more open to elements in its operating environment (including stakeholders) as potential sources of learning and change, as the firm is oriented towards creating space for fundamental change (through boundary redefinition), opening issues to entirely new ways of thinking and addressing issues from a whole-system perspective. A significant implication of this study is that potentially only those companies that see themselves as a 'leader' are ultimately able to tap the innovation potential of stakeholder dialogue.

Relevance: 90.00%

Abstract:

We approach the Report of the High-level Panel on United Nations System-wide Coherence (the "Panel Report") and the United Nations reform process of which it forms part as civil society groups with long experience in advocating reforms of the United Nations system.

Relevance: 90.00%

Abstract:

Granular flow phenomena are frequently encountered in the design of process and industrial plants in the traditional fields of the chemical, nuclear and oil industries, as well as in other activities such as food and materials handling. Multi-phase flow is one important branch of granular flow. Granular materials exhibit unusual kinds of behavior compared to normal materials, either solids or fluids. Although some of their characteristics are still not well known, one thing is confirmed: the particle-particle interaction plays a key role in the dynamics of granular materials, especially for dense granular materials. At the beginning of this thesis, the development of two models for describing this interaction, based on the results of finite-element simulation, dimensional analysis and numerical simulation, is presented in detail. The first model describes the normal collision of viscoelastic particles. Building on some existing models, more parameters are added, which make the model predict the experimental results more accurately. The second model is used for oblique collisions and includes the effects of tangential velocity, angular velocity and surface friction based on Coulomb's law. The theoretical predictions of this model are in agreement with those of finite-element simulation. In the latter chapters of this thesis, the models are used to predict industrial granular flows, and the agreement between the simulations and experiments also validates the new models. The first case presents the simulation of granular flow passing over a circular obstacle. The simulations successfully predict the existence of a parabolic steady layer and show how the characteristics of the particles, such as the coefficients of restitution and surface friction, affect the separation results. The second case is a spinning container filled with granular material. Employing the previous models, the simulation could also reproduce experimentally observed phenomena, such as a depression forming in the center under high-frequency rotation. The third application concerns gas-solid mixed flow in a vertically vibrated device. Gas-phase motion is coupled with the particle motion. The governing equations of the gas phase are solved using large eddy simulation (LES), and the particle motion is predicted using the Lagrangian method. The simulation predicted some of the pattern formation reported in experiments.
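As a generic illustration of the kind of particle-particle contact law discussed above (a linear spring-dashpot normal force with the tangential force capped by Coulomb's law; this is a common simplified DEM form, not the specific viscoelastic model developed in the thesis), consider:

def contact_force(overlap, overlap_rate, v_tangential,
                  k_n=1.0e4, c_n=5.0, c_t=2.0, mu=0.5):
    """Normal and tangential contact forces for two colliding particles.
    overlap       : normal overlap (m), > 0 while the particles touch
    overlap_rate  : rate of change of the overlap (m/s)
    v_tangential  : relative tangential (sliding) velocity at the contact (m/s)
    k_n, c_n      : normal spring stiffness and damping (illustrative values)
    c_t, mu       : tangential damping and Coulomb friction coefficient (illustrative)"""
    if overlap <= 0.0:
        return 0.0, 0.0  # no contact
    # Normal force: elastic (spring) plus dissipative (dashpot) contribution.
    f_n = max(k_n * overlap + c_n * overlap_rate, 0.0)
    # Tangential force opposing sliding, limited by Coulomb's law |f_t| <= mu * f_n.
    f_t = -c_t * v_tangential
    f_t = max(-mu * f_n, min(mu * f_n, f_t))
    return f_n, f_t

# Example: 1 mm overlap, approaching at 0.2 m/s, sliding at 0.5 m/s.
print(contact_force(1.0e-3, 0.2, 0.5))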