962 results for complex issues
Abstract:
Environmental impacts typically extend beyond the boundaries of a single company, which is why purchasing decisions play an important role in enforcing environmental considerations in a supply chain context. Many examples could be cited in which an alternative is environmentally advantageous by one criterion yet burdens the environment when the supply chain as a whole is considered. Measuring environmental impacts at the supply chain level, however, poses serious challenges, and the topic has inspired significant research and development. One area with substantial research results is the incorporation of environmental criteria into supplier evaluation. Joining this stream of research, the authors investigate how, in one of the most widely used supplier evaluation methods, the weighted scoring system, the weight for a given criterion can be determined at which that criterion becomes a decision-influencing factor. To this end, they apply the composite indicators (CI) method of DEA (Data Envelopment Analysis), and they use linear programming theory to determine the importance of the common weight of the criteria. _____ Management decisions often have an environmental effect not just within the company but outside it as well, which is why the supply chain context is highlighted in the literature. Measuring the environmental issues of supply decisions raises many problems from both a methodological and a practical point of view. This has inspired a rapidly growing literature, with many studies published on how to incorporate environmental issues into supplier evaluation. This paper contributes to this stream of research by developing a method to support weight selection. The method of Data Envelopment Analysis (DEA) is used to study the extension of traditional supplier selection methods with environmental factors, since the selection of the weight system can control the result of the selection process.
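The threshold-weight question the abstract poses can be illustrated with a toy two-criterion weighted scoring model (the supplier names, scores and the closed-form solution below are illustrative assumptions, not the authors' DEA/CI formulation):

```python
def min_decisive_weight(scores_a, scores_b):
    """Smallest environmental weight w in [0, 1] at which supplier A
    overtakes supplier B under the weighted score
    S = (1 - w) * cost_score + w * env_score.
    Returns None if A never wins on that range."""
    cost_a, env_a = scores_a
    cost_b, env_b = scores_b
    d_cost = cost_a - cost_b
    d_env = env_a - env_b
    # Score difference is linear in w: f(w) = d_cost + w * (d_env - d_cost).
    if d_env == d_cost:
        # f is constant: A wins for every w, or for none.
        return 0.0 if d_cost > 0 else None
    w_star = -d_cost / (d_env - d_cost)  # root of f(w) = 0
    if d_env > d_cost:                   # f increasing: A wins for w > w_star
        if w_star < 0:
            return 0.0
        return w_star if w_star <= 1 else None
    # f decreasing: A wins only at low weights, i.e. from w = 0 if d_cost > 0.
    return 0.0 if d_cost > 0 else None

# Supplier A: worse on cost (70 vs 80) but better environmentally (90 vs 60).
w = min_decisive_weight((70, 90), (80, 60))
```

In this sketch the environmental criterion becomes decision-influencing at w = 0.25: below that weight supplier B wins on cost, above it supplier A's environmental advantage flips the ranking. The paper's DEA-based approach generalizes this idea to common weights over many criteria via linear programming.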
Abstract:
This presentation focuses on methods for the evaluation of complex policies. In particular, it focuses on evaluating interactions between policies and the extent to which two or more interacting policies mutually reinforce or hinder one another, in the area of environmental sustainability. Environmental sustainability is increasingly gaining recognition as a complex policy area, requiring a more systemic perspective and approach (e.g. European Commission, 2011). Current trends in human levels of resource consumption are unsustainable, and single solutions that target isolated issues independently of the broader context have so far fallen short. Instead, there is a growing call among both academics and policy practitioners for systemic change that acknowledges and engages with the complex interactions, barriers and opportunities across the different actors, sectors, and drivers of production and consumption. Policy mixes, and the combination and ordering of policies within them, therefore become an important focus for those aspiring to design and manage transitions to sustainability. To this end, we need a better understanding of the interactions, synergies and conflicts between policies (Cunningham et al., 2013; Geels, 2014). As a contribution to this emerging field of research, and to inform its next steps, I present a review of the methods available for quantifying the impacts of complex policy interactions, since there is no established method among practitioners, and I explore the merits of such attempts. The presentation builds on key works in the field of complexity science (e.g. Anderson, 1972), revisiting and combining these with more recent contributions in the emerging field of policy, complex systems, and evaluation (e.g. Johnstone et al., 2010).
With a coalition of UK Government departments, agencies and Research Councils soon to announce the launch of a new internationally leading centre to pioneer, test and promote innovative and inclusive methods for policy evaluation across the energy-environment-food nexus, the contribution is particularly timely.
Abstract:
The rise of computing and the internet has brought about an ethical field of studies that some term information ethics, computer ethics, digital media ethics, or internet ethics. The aim of this contribution is to discuss information ethics' foundations in the context of the internet's political economy. The chapter first looks to ground the analysis in a comparison of two information ethics approaches, namely those outlined by Rafael Capurro and Luciano Floridi. It then develops, based on these foundations, analyses of the information-ethical dimensions of two important areas of social media: one concerns the framing of social media by a surveillance-industrial complex in the context of Edward Snowden's revelations, and the other deals with issues of digital labour processes and issues of class that arise in this context. The contribution asks ethical questions about these two phenomena that bring up issues of power, exploitation, and control in the information age. It asks if, and if so how, the approaches of Capurro and Floridi can help us to understand ethico-political aspects of the surveillance-industrial complex and digital labour.
Abstract:
Unstructured-mesh codes for the modelling of continuum physics phenomena have evolved to provide the facility to model complex interacting systems. Such codes have the potential to deliver high performance on parallel platforms for a small investment in programming. The critical parameters for success are to minimise changes to the code, so as to ease maintenance, while providing high parallel efficiency, scalability to large numbers of processors and portability to a wide range of platforms. The paradigm of domain decomposition with message passing has for some time been demonstrated to provide a high level of efficiency, scalability and portability across shared and distributed memory systems without the need to re-author the code in a new language. This paper addresses these issues in the parallelisation of a complex three-dimensional unstructured-mesh finite volume multiphysics code and discusses the implications of automating the parallelisation process.
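The domain-decomposition-with-message-passing paradigm can be sketched in miniature. The serial Python toy below partitions a 1-D "mesh" into subdomains and copies halo values between them in place of real message passing; the mesh, the smoother and the partition counts are illustrative assumptions, not the paper's code:

```python
import numpy as np

def partition(n_cells, n_parts):
    """Split cell indices 0..n_cells-1 into contiguous subdomains."""
    bounds = np.linspace(0, n_cells, n_parts + 1).astype(int)
    return [np.arange(bounds[p], bounds[p + 1]) for p in range(n_parts)]

def smooth_with_halos(u, parts):
    """One Jacobi-style smoothing sweep done subdomain by subdomain.
    Halo values are copied from the neighbouring subdomain's owned cells,
    standing in for a message-passing send/receive pair."""
    new = u.copy()
    for cells in parts:
        lo, hi = cells[0], cells[-1]
        # 'Receive' halo values from neighbours (at the physical boundary,
        # reuse the subdomain's own edge value).
        left = u[lo - 1] if lo > 0 else u[lo]
        right = u[hi + 1] if hi < len(u) - 1 else u[hi]
        local = np.concatenate(([left], u[lo:hi + 1], [right]))
        new[lo:hi + 1] = 0.5 * (local[:-2] + local[2:])  # neighbour average
    return new

u = np.linspace(0.0, 1.0, 12)   # initial field on a 12-cell 1-D mesh
parts = partition(len(u), 3)    # 3 'processors'
u1 = smooth_with_halos(u, parts)
```

Because every subdomain reads its halo values from the previous iterate, the partitioned sweep reproduces the single-domain sweep exactly; in a real parallel code the halo copies become point-to-point messages (e.g. MPI send/receive pairs), which is what lets the original serial algorithm survive unchanged.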
Abstract:
The main aim of this book is to consider how the sales function informs business strategy. Although a number of books address how to manage the sales team tactically, this text addresses how sales can help organizations become more customer oriented. Many organizations are facing escalating costs and a growth in customer power, which makes it necessary to allocate resources more strategically. The sales function can provide critical customer and market knowledge to help inform both innovation and marketing. Sales is responsible for building customer knowledge and networking both internally and externally to help create additional customer value, as well as for the more traditional role of managing customer relationships and selling. The text considers how sales organizations are responding to increasing competition, more demanding customers and a more complex selling environment. We identify many of the challenges facing organizations today and discuss some of the possible solutions. This book considers the changing nature of sales and how activities can be aligned within the organization, as well as market sensing, creating customer focus and the role of sales leadership. The text includes illustrations (short case studies) provided by a range of successful organizations operating in a number of industries. Sales and senior management play an important role in ensuring that the sales team's activities are aligned to business strategy and in creating an environment that allows salespeople to be more successful in developing new business opportunities and building long-term profitable business relationships. One of the objectives of this book is to consider how conventional thinking has changed in the last five years and to integrate it with examples from sales practice to provide a more complete picture of the role of sales within the modern organization.
Abstract:
This thesis investigates the recent complex spatial changes in Namibia and Tanzania and local communities' capacity to cope with, adapt to and transform the unpredictability associated with these processes. I scrutinise the concept of resilience and its potential application to explaining the development of local communities in Southern Africa when facing various social, economic and environmental changes. My research is based on three distinct but overlapping research questions: what are the main spatial changes and their impact on the study areas in Namibia and Tanzania? What are the adaptation, transformation and resilience processes of the studied local communities in Namibia and Tanzania? How are innovation systems developed, and what is their impact on the resilience of the studied local communities in Namibia and Tanzania? I use four ethnographic case studies concerning environmental change, global tourism and innovation system development in Namibia and Tanzania, as well as mixed-methodological approaches, to study these issues. The results of my empirical investigation demonstrate that the spatial changes in the localities within Namibia and Tanzania are unique, loose assemblages, a result of the complex, multisided, relational and evolutionary development of human and non-human elements that do not necessarily have linear causalities. Several changes co-exist and are interconnected, though uncertain and unstructured, and, together with the multiple stressors related to poverty, they have made communities more vulnerable to different changes. The communities' adaptation and transformation measures have been mostly reactive, based on contingency and post hoc learning. Despite various anticipation techniques, coping measures, adaptive learning and self-organisation processes occurring in the localities, the local communities are constrained by their uneven power relationships within the larger assemblages.
Thus, communities' own opportunities to increase their resilience are limited without changing the relations within these multiform entities. Larger cooperation models are therefore needed, such as an innovation system based on the interactions of different actors, which requires collaboration among, and input from, a diverse set of stakeholders to combine different sources of knowledge, innovation and learning. Accordingly, both Namibia and Tanzania are developing an innovation system as their key policy for fostering transformation towards knowledge-based societies. Finally, the development of an innovation system needs novel bottom-up approaches that increase the resilience of local communities and embed the system in them. Therefore, innovation policies in Namibia have emphasised the role of indigenous knowledge, and Tanzania has established the Living Lab network.
Abstract:
Big data and AI are paving the way to promising scenarios in clinical practice and research. However, the use of such technologies might clash with GDPR requirements. Today, two forces are driving EU policies in this domain. The first is the necessity to protect individuals' safety and fundamental rights. The second is the need to incentivize the deployment of innovative technologies. The first objective is pursued by legislative acts such as the GDPR or the AIA; the second is supported by the new data strategy recently launched by the European Commission. Against this background, the thesis analyses the issue of GDPR compliance when big data and AI systems are implemented in the health domain, focusing on the use of co-regulatory tools for compliance with the GDPR. This work argues that there are two levels of co-regulation in the EU legal system. The first, more general, is the approach pursued by the EU legislator when shaping legislative measures that deal with fast-evolving technologies. The GDPR can be deemed a co-regulatory solution since it mainly introduces general requirements, whose implementation must then be interpreted by the addressees of the law following a risk-based approach. This approach, although useful, is costly and sometimes burdensome for organisations. The second co-regulatory level is represented by specific co-regulatory tools, such as codes of conduct and certification mechanisms, which are meant to guide and support the interpretive effort of the addressees of the law. The thesis argues that the lack of co-regulatory tools implementing data protection law in specific situations could be an obstacle to the deployment of innovative solutions in complex scenarios such as the health ecosystem, and it advances theoretical hypotheses about the reasons for this lack of co-regulatory solutions.
Abstract:
In this thesis, the viability of Dynamic Mode Decomposition (DMD) as a technique to analyze and model complex dynamic real-world systems is presented. This method derives, directly from data, computationally efficient reduced-order models (ROMs) that can replace high-fidelity physics-based models which are too onerous or unavailable. Optimizations and extensions to the standard implementation of the methodology are proposed, investigating diverse case studies related to the decoding of complex flow phenomena. The flexibility of this data-driven technique allows its application to high-fidelity fluid dynamics simulations as well as to time series of real system observations. The resulting ROMs are tested on two tasks: (i) reducing the storage requirements of high-fidelity simulations or observations; (ii) interpolating and extrapolating missing data. The capabilities of DMD can also be exploited to alleviate the cost of onerous studies that require many simulations, such as uncertainty quantification analysis, especially when dealing with complex high-dimensional systems. In this context, a novel approach is proposed to address parameter variability when modeling systems with a space- and time-variant response. Specifically, DMD is merged with another model-reduction technique, the Polynomial Chaos Expansion, for uncertainty quantification purposes. The study yields useful guidelines for DMD deployment, together with a demonstration of its potential to ease diagnosis and scenario analysis when complex flow processes are involved.
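The standard (exact) DMD algorithm on which such work builds can be sketched in a few lines; the synthetic 2x2 demo data below are illustrative, not the thesis's case studies:

```python
import numpy as np

def dmd(X, r):
    """Exact DMD: fit a rank-r linear operator advancing snapshot k to k+1.
    X has one snapshot per column; returns DMD eigenvalues and modes."""
    X1, X2 = X[:, :-1], X[:, 1:]                 # paired snapshot matrices
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh[:r].conj().T   # truncated POD basis
    A_tilde = U.conj().T @ X2 @ V / s            # operator projected onto basis
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ V / s @ W / eigvals             # exact DMD modes
    return eigvals, modes

# Demo: snapshots of a known 2x2 linear system; DMD recovers its eigenvalues.
A = np.array([[0.9, -0.2], [0.2, 0.9]])
X = np.empty((2, 10))
X[:, 0] = [1.0, 0.0]
for k in range(9):
    X[:, k + 1] = A @ X[:, k]
eigvals, modes = dmd(X, 2)
```

On snapshots generated by a known linear operator the recovered DMD eigenvalues match the operator's (0.9 ± 0.2i here); on real flow data they encode the growth rates and frequencies of the dominant coherent structures, which is what makes the resulting ROMs useful for compression and for filling in missing data.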
Abstract:
Wild animals have been kept as pets for centuries; in Brazil, companionship is one of the main reasons why wild species are legally bred and traded. This paper attempts to call attention to problems concerning the welfare of wild pets involved in the trading system in Brazil. Some of the issues presented are: a) the significant increase in the number of wildlife breeders and traders and the difficulties faced by the Brazilian government in controlling this activity; b) the main welfare issues faced by breeders and owners of wild pets; and c) the destination of wild pets that are no longer wanted. Finally, some recommendations are made with the welfare of the animals as a priority.
Abstract:
The epididymis has an important role in the maturation of sperm for fertilization, but little is known about the epididymal molecules involved in sperm modifications during this process. We have previously described the expression pattern for an antigen in epididymal epithelial cells that reacts with the monoclonal antibody (mAb) TRA 54. Immunohistochemical and immunoblotting analyses suggest that the epitope of the epididymal antigen probably involves a sugar moiety that is released into the epididymal lumen in an androgen-dependent manner and subsequently binds to luminal sperm. Using column chromatography, SDS-PAGE with in situ digestion and mass spectrometry, we have identified the protein recognized by mAb TRA 54 in mouse epididymal epithelial cells. The ∼65 kDa protein is part of a high molecular mass complex (∼260 kDa) that is also present in the sperm acrosomal vesicle and is completely released after the acrosomal reaction. The amino acid sequence of the protein corresponded to that of albumin. Immunoprecipitates with anti-albumin antibody contained the antigen recognized by mAb TRA 54, indicating that the epididymal molecule recognized by mAb TRA 54 is albumin. RT-PCR detected albumin mRNA in the epididymis and fertilization assays in vitro showed that the glycoprotein complex containing albumin was involved in the ability of sperm to recognize and penetrate the egg zona pellucida. Together, these results indicate that epididymal-derived albumin participates in the formation of a high molecular mass glycoprotein complex that has an important role in egg fertilization.
Abstract:
The aim of this work was to characterize the effects of partial inhibition of respiratory complex I by rotenone on H2O2 production by isolated rat brain mitochondria in different respiratory states. Flow cytometric analysis of membrane potential in isolated mitochondria indicated that rotenone leads to uniform respiratory inhibition when added to a suspension of mitochondria. When mitochondria were incubated in the presence of a low concentration of rotenone (10 nM) and NADH-linked substrates, oxygen consumption was reduced from 45.9 ± 1.0 to 26.4 ± 2.6 nmol O2 mg(-1) min(-1) and from 7.8 ± 0.3 to 6.3 ± 0.3 nmol O2 mg(-1) min(-1) in respiratory states 3 (ADP-stimulated respiration) and 4 (resting respiration), respectively. Under these conditions, mitochondrial H2O2 production was stimulated from 12.2 ± 1.1 to 21.0 ± 1.2 pmol H2O2 mg(-1) min(-1) and from 56.5 ± 4.7 to 95.0 ± 11.1 pmol H2O2 mg(-1) min(-1) in respiratory states 3 and 4, respectively. Similar results were observed when comparing mitochondrial preparations enriched with synaptic or nonsynaptic mitochondria or when 1-methyl-4-phenylpyridinium ion (MPP(+)) was used as a respiratory complex I inhibitor. Rotenone-stimulated H2O2 production in respiratory states 3 and 4 was associated with a high reduction state of endogenous nicotinamide nucleotides. In succinate-supported mitochondrial respiration, where most of the mitochondrial H2O2 production relies on electron backflow from complex II to complex I, low rotenone concentrations inhibited H2O2 production. Rotenone had no effect on mitochondrial elimination of micromolar concentrations of H2O2. The present results support the conclusion that partial complex I inhibition may result in mitochondrial energy crisis and oxidative stress, the former being predominant under oxidative phosphorylation and the latter under resting respiration conditions.
Abstract:
Monte Carlo track structures (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in published literature in DNA strand break (SB) yields and selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28 keV-220 keV were superimposed on a geometric DNA model composed of 2.7 × 10(6) nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy ETH. The SB frequencies and complexities in nucleosomes as a function of incident electron energies were obtained. SBs were classified into higher order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give similar SB yields as the one based on energy deposit when ETH ≈ 10.79 eV, but deviate significantly for higher ETH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes that present a complex damage pattern of more than 2 SBs and the degree of complexity of the damage in these nucleosomes diminish as the incident electron energy increases. DNA damage classification into SSB and DSB is highly dependent on the definitions of these higher order structures and their implementations. 
The authors show that, for the four studied models, the expected yields differ by up to 54% for SSBs and by up to 32% for DSBs, as a function of the incident electron energy and of the models being compared. MCTS simulations allow comparison of the direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms such as the DNA model, dose distribution, SB definition, and DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.
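The clustering step described above, grouping SBs into SSBs and DSBs by inter-SB distance and affected strand, can be illustrated with a toy classifier (the 10 bp threshold and the break data are illustrative assumptions; the study compares several such definitions):

```python
def classify_breaks(breaks, max_dist=10):
    """Group strand breaks into damage sites and label each site SSB or DSB.
    `breaks` is a list of (position_bp, strand) tuples with strand 0 or 1.
    Breaks within `max_dist` base pairs of the previous break join its site;
    a site touching both strands is a double strand break (DSB),
    otherwise it is a single strand break (SSB)."""
    sites = []
    for pos, strand in sorted(breaks):
        if sites and pos - sites[-1][-1][0] <= max_dist:
            sites[-1].append((pos, strand))   # extend the current cluster
        else:
            sites.append([(pos, strand)])     # start a new damage site
    labels = []
    for site in sites:
        strands = {s for _, s in site}
        labels.append("DSB" if strands == {0, 1} else "SSB")
    return labels

# Two nearby breaks on opposite strands form one DSB; an isolated break is an SSB.
labels = classify_breaks([(100, 0), (105, 1), (400, 0)])
```

Even in this sketch the outcome hinges on the chosen `max_dist` and on how sites are chained, which mirrors the paper's point that SSB/DSB yields are highly sensitive to the clustering definitions and their implementations.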
Abstract:
A new platinum(II) complex with the amino acid L-tryptophan (trp), named Pt-trp, was synthesized and characterized. Elemental, thermogravimetric and ESI-QTOF mass spectrometric analyses led to the composition [Pt(C11H11N2O2)2]⋅6H2O. Infrared spectroscopic data indicate the coordination of trp to Pt(II) through the oxygen of the carboxylate group and also through the nitrogen atom of the amino group. The (13)C CP/MAS NMR spectroscopic data confirm coordination through the oxygen atom of the carboxylate group, while the (15)N CP/MAS NMR data confirm coordination of the nitrogen of the NH2 group to the metal. Density functional theory (DFT) studies were applied to evaluate the cis and trans coordination modes of trp to platinum(II); the trans isomer was shown to be energetically more stable than the cis one. The Pt-trp complex was evaluated as a cytotoxic agent against the SK-Mel 103 (human melanoma) and Panc-1 (human pancreatic carcinoma) cell lines and was shown to be cytotoxic toward both.