967 results for 2015 election process
Abstract:
In this research project, I integrate two research streams on strategic decision making in international firms: upper echelons or top management team (TMT) internationalization research and international strategic decision-making process research. Both streams of the international business literature have evolved independently, but there is potential in combining them. The first empirical paper, “TMT internationalization and international strategic decision making process: a decision level analysis of rationality, speed, and performance”, explores the influence of TMT internationalization on strategic decision rationality and speed and, subsequently, their effect on international strategic decision effectiveness (performance). The results show that TMT internationalization is positively related to decision effectiveness and that this relationship is mediated by decision rationality, whereas the hypotheses regarding the association between TMT internationalization and decision speed, and the mediating effect of speed, were not supported. The second paper of my thesis, “TMT internationalization and international strategic decision rationality: the mediating role of international information”, is a simple but logical extension of the first. The first paper showed that TMT internationalization has a significant positive effect on international strategic decision rationality. The second paper shows explicitly that the effect of TMT internationalization on international strategic decision rationality stems from two sources: international experience (personal international knowledge and information) and international information collected through managerial international contacts. For this research project, I collected data from international software firms in Pakistan.
My research contributes to the literature on upper echelons theory and strategic decision making in the context of international business and international firms by explicitly examining the link between TMT internationalization and the characteristics of the strategic decision-making process (i.e., rationality and speed) in international firms, and their possible mediating effect on performance.
The synthesis of maleic anhydride: study of a new process and improvement of the industrial catalyst
Abstract:
Maleic anhydride is an important chemical intermediate mainly produced by the selective oxidation of n-butane, an industrial process catalyzed by vanadyl pyrophosphate-based materials, (VO)2P2O7. The first topic was investigated in collaboration with a company specialized in the production of organic anhydrides (Polynt SpA), with the aim of improving the performance of the process for the selective oxidation of n-butane to maleic anhydride, by comparing the behavior of an industrial vanadyl pyrophosphate catalyst when utilized either in the industrial plant or in a lab-scale reactor. The study focused on how the catalyst characteristics and reactivity are affected by the reaction conditions, and on how the addition of a dopant can enhance the catalytic performance. Moreover, the ageing of the catalyst was studied in order to correlate the deactivation process with the modifications occurring in the catalyst. The second topic was carried out within the Seventh Framework Programme (FP7) European project “EuroBioRef”. The study focused on a new route for the synthesis of maleic anhydride starting from an alternative reactant produced by fermentation of biomass: “bio-1-butanol”. In this field, the different possible catalytic configurations were investigated: the process was divided into two main reactions, the dehydration of 1-butanol to butenes and the selective oxidation of butenes to maleic anhydride. The features needed to catalyze the two steps were analyzed, and different materials were proposed as catalysts, namely Keggin-type polyoxometalates, VOPO4∙2H2O and (VO)2P2O7. The reactivity of 1-butanol was tested under different conditions, in order to optimize the performance and to understand the nature of the interaction between the alcohol and the catalyst surface. Then, the key intermediates in the mechanism of 1-butanol oxidehydration to MA were studied, with the aim of elucidating the possible reaction mechanism.
Lastly, the reactivity of the chemically sourced 1-butanol was compared with that of different types of bio-butanols produced by biomass fermentation.
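The two-step route described above can be summarized by a simplified overall reaction scheme (a sketch only: the dehydration actually yields a mixture of linear butenes, represented here by a single C4H8 formula):

```latex
\begin{align*}
\mathrm{C_4H_9OH} &\longrightarrow \mathrm{C_4H_8} + \mathrm{H_2O}
  && \text{(dehydration of 1-butanol)}\\
\mathrm{C_4H_8} + 3\,\mathrm{O_2} &\longrightarrow \mathrm{C_4H_2O_3} + 3\,\mathrm{H_2O}
  && \text{(selective oxidation to maleic anhydride)}
\end{align*}
```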
Abstract:
The purpose of the first part of the research activity was to develop an aerobic cometabolic process in packed-bed reactors (PBRs) to treat real groundwater contaminated by trichloroethylene (TCE) and 1,1,2,2-tetrachloroethane (TeCA). In an initial screening conducted in batch bioreactors, groundwater samples from 5 wells of the contaminated site were fed with 5 growth substrates. The work led to the selection of butane as the best growth substrate, and to the development and characterization, from the site’s indigenous biomass, of a suspended-cell consortium capable of degrading TCE with 90 % mineralization of the organic chlorine. A study conducted in batch and continuous-flow PBRs led to the identification of the best carrier. A kinetic study of butane and TCE biodegradation indicated that the attached-cell consortium is characterized by lower TCE-specific degradation rates and by a lower level of mutual butane-TCE inhibition. A 31 L bioreactor was designed and set up to scale up the experiment. The second part of the research focused on the biodegradation of 4 polymers, with and without chemical pre-treatment: linear low-density polyethylene (LLDPE), polypropylene (PP), polystyrene (PS) and polyvinyl chloride (PVC). Initially, the 4 polymers were subjected to different chemical pre-treatments: ozonation and UV/ozonation, in the gaseous and aqueous phases. It was found that, for LLDPE and PP, coupling UV and ozone in the gas phase is the most effective way to oxidize the polymers and to generate carbonyl groups on the polymer surface. In further tests, the effect of chemical pre-treatment on polymer biodegradability was studied. Gas-phase-ozonated and virgin polymers were incubated aerobically with: (a) a pure strain, (b) a mixed culture of bacteria; and (c) a fungal culture, together with saccharose as a co-substrate.
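The mutual butane-TCE inhibition mentioned above is commonly described with competitive-inhibition Monod-type kinetics. The abstract does not give the model actually fitted, so the following form and symbols are assumptions for illustration (r = TCE degradation rate, k = maximum specific rate, X = biomass concentration, S = substrate concentrations, K_S = half-saturation constants, subscript B = butane):

```latex
r_{\mathrm{TCE}} =
\frac{k_{\mathrm{TCE}}\, X\, S_{\mathrm{TCE}}}
     {K_{S,\mathrm{TCE}}\left(1 + \dfrac{S_{\mathrm{B}}}{K_{S,\mathrm{B}}}\right) + S_{\mathrm{TCE}}}
```

A symmetric expression, with the roles of TCE and butane exchanged, would describe the inhibition of butane uptake by TCE.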
Abstract:
In this study, a novel method, MicroJet reactor technology, was developed to enable the custom preparation of nanoparticles. Danazol/HPMCP HP50 and Gliclazide/Eudragit S100 nanoparticles were used as model systems for investigating the effects of the process parameters and of the microjet reactor setup on nanoparticle properties during the microjet reactor construction. Following the feasibility study of the microjet reactor system, three different nanoparticle formulations were prepared using fenofibrate as model drug. Fenofibrate nanoparticles stabilized with poloxamer 407 (FN), fenofibrate nanoparticles in a hydroxypropyl methyl cellulose phthalate (HPMCP) matrix (FHN) and fenofibrate nanoparticles in an HPMCP and chitosan matrix (FHCN) were prepared by controlled precipitation using MicroJet reactor technology. Particle sizes of all nanoparticle formulations were adjusted to 200-250 nm. The changes in the experimental parameters altered the system thermodynamics, resulting in the production of nanoparticles between 20-1000 nm (PDI < 0.2) with high drug-loading efficiencies (96.5% at a 20:1 polymer:drug ratio). Drug release from all nanoparticle formulations was fast and complete after 15 minutes in both FaSSIF and FeSSIF media, whereas in mucoadhesiveness tests only the FHCN formulation was found to be mucoadhesive. Results of the Caco-2 studies revealed that the % dose absorbed values were significantly higher (p < 0.01) for FHCN in both cases where FaSSIF and FeSSIF were used as transport buffer.
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is checking the safety clearances of individual components, the so-called distance analysis. Engineers determine, for given components, whether they maintain a prescribed safety distance to the surrounding components, both at rest and during a motion. If components fall below the safety distance, their shape or position must be changed. To do this, it is important to know exactly which regions of the components violate the safety distance.

In this thesis we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g., triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety distance, and call it the set of all tolerance-violating primitives. We present a holistic solution that can be divided into the following three major topics.

In the first part of this thesis we examine algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches to triangle-triangle tolerance tests and show that dedicated tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for the computation of all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.

The second part of this thesis deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety distance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of pairs of primitives to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before. We call this data structure Shrubs. Previous approaches to the memory optimization of uniform grids mainly rely on hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our use case, neighboring cells often have similar contents. Our approach is able to losslessly compress the memory footprint of the cell contents of a uniform grid, exploiting the redundant cell contents, to one fifth of its original size, and to decompress it at runtime.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure distance analysis, we show applications to various path-planning problems.
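A minimal Python sketch of the grid-based broad phase described above. Points stand in for primitive centroids, and the exact triangle-triangle tolerance test of the thesis is replaced by a simple point-distance check; all names are illustrative, not the thesis implementation. With cell size equal to the safety distance, any violating partner must lie in one of the 27 neighboring cells.

```python
from collections import defaultdict
from itertools import product
import math

def tolerance_violations(points_a, points_b, tol):
    """Return index pairs (i, j) with dist(points_a[i], points_b[j]) < tol,
    using a uniform grid with cell size == tol as broad phase."""
    grid = defaultdict(list)
    for j, q in enumerate(points_b):
        key = tuple(math.floor(c / tol) for c in q)
        grid[key].append(j)
    violating = set()
    for i, p in enumerate(points_a):
        base = tuple(math.floor(c / tol) for c in p)
        # Any point within tol of p lies in one of the 27 surrounding cells.
        for off in product((-1, 0, 1), repeat=3):
            key = tuple(b + o for b, o in zip(base, off))
            for j in grid.get(key, ()):
                if math.dist(p, points_b[j]) < tol:  # narrow-phase check
                    violating.add((i, j))
    return violating
```

In the thesis setting, the narrow-phase check would be a triangle-triangle tolerance test, and the grid cell size is likewise tied to the required safety distance.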
Abstract:
This thesis covers the theories, methodologies and motivations that led to the development of a web application for the reconciliation of incoming payments, a function peculiar to a company's accounting. The design choices adopted rest on a series of studies of business processes and of how these can influence the analysis and development of the application itself. To carry out a transition of this kind, methodologies such as Business Process Management and, inevitably, Business Process Re-engineering were adopted, which make it possible to modify, improve, adapt and optimize processes.
Abstract:
The fuzzy analytic network process (FANP) is introduced as a potential multi-criteria decision-making (MCDM) method to improve digital marketing management endeavors. Today’s information overload makes digital marketing optimization, which is needed to continuously improve one’s business, increasingly difficult. The proposed FANP framework is a method for enhancing the interaction between customers and marketers (i.e., the involved stakeholders) and thus for reducing the challenges of big data. The presented implementation takes the fuzziness of reality into account to manage the constant interaction and continuous development of communication between marketers and customers on the Web. Using this FANP framework, marketers are able to better meet the varying requirements of their customers. To improve the understanding of the implementation, advanced visualization methods (e.g., wireframes) are used.
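The article does not spell out its FANP computations, but a standard building block of fuzzy ANP/AHP is Buckley's fuzzy geometric-mean method over triangular fuzzy numbers. A minimal sketch, in which the function name and the example matrix are hypothetical:

```python
from math import prod

def fuzzy_priority_weights(matrix):
    """Buckley's geometric-mean method for a pairwise-comparison matrix of
    triangular fuzzy numbers (l, m, u). Returns crisp, normalized priority
    weights via centroid defuzzification."""
    n = len(matrix)
    # Fuzzy geometric mean of each row, computed component-wise.
    r = [tuple(prod(a[k] for a in row) ** (1.0 / n) for k in range(3))
         for row in matrix]
    total = tuple(sum(ri[k] for ri in r) for k in range(3))
    # Fuzzy division: (l, m, u) / (L, M, U) = (l/U, m/M, u/L).
    w = [(ri[0] / total[2], ri[1] / total[1], ri[2] / total[0]) for ri in r]
    crisp = [(l + m + u) / 3.0 for (l, m, u) in w]  # centroid defuzzification
    s = sum(crisp)
    return [c / s for c in crisp]
```

For a 2x2 comparison in which criterion 0 is judged "moderately more important" (fuzzy 3), the first weight comes out correspondingly larger than the second.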
Abstract:
Offset printing is a common method to produce large amounts of printed matter. We consider a real-world offset printing process that is used to imprint customer-specific designs on napkin pouches. The production equipment used gives rise to various technological constraints. The planning problem consists of allocating designs to printing-plate slots such that the given customer demand for each design is fulfilled, all technological and organizational constraints are met and the total overproduction and setup costs are minimized. We formulate this planning problem as a mixed-binary linear program, and we develop a multi-pass matching-based savings heuristic. We report computational results for a set of problem instances devised from real-world data.
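The abstract does not give the model itself, but a pattern-based sketch conveys the flavor of such plate-allocation programs. Assume a precomputed set P of feasible plate patterns, with n_{dp} the number of slots carrying design d in pattern p (data) and D_d the demand; the variables are the run length r_p >= 0, the setup indicator y_p in {0,1}, and the overproduction o_d >= 0, with M a large constant. All symbols here are assumptions, not the paper's formulation:

```latex
\begin{align*}
\min\ & \sum_{p \in P} c^{\mathrm{setup}}\, y_p + \sum_{d} c^{\mathrm{over}}\, o_d \\
\text{s.t.}\ & \sum_{p \in P} n_{dp}\, r_p = D_d + o_d && \forall d
  && \text{(demand met; excess counted)} \\
& r_p \le M\, y_p && \forall p
  && \text{(a pattern runs only if set up)} \\
& r_p \ge 0,\quad o_d \ge 0,\quad y_p \in \{0,1\}
\end{align*}
```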
Abstract:
Multi-objective optimization algorithms aim at finding Pareto-optimal solutions. Recovering Pareto fronts or Pareto sets from a limited number of function evaluations is a challenging problem. A popular approach in the case of expensive-to-evaluate functions is to appeal to metamodels. Kriging has been shown to be efficient as a basis for sequential multi-objective optimization, notably through infill sampling criteria balancing exploitation and exploration, such as the Expected Hypervolume Improvement. Here we consider Kriging metamodels not only for selecting new points, but as a tool for estimating the whole Pareto front and quantifying how much uncertainty remains on it at any stage of Kriging-based multi-objective optimization algorithms. Our approach relies on the Gaussian process interpretation of Kriging and builds upon conditional simulations. Using concepts from random set theory, we propose to adapt the Vorob’ev expectation and deviation to capture the variability of the set of non-dominated points. Numerical experiments illustrate the potential of the proposed workflow, and it is shown on examples how Gaussian process simulations and the estimated Vorob’ev deviation can be used to monitor the ability of Kriging-based multi-objective optimization algorithms to accurately learn the Pareto front.
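At the core of estimating a Pareto front from Gaussian process simulations is non-dominated filtering: each conditional simulation yields objective vectors whose non-dominated subset is one realization of the front. A minimal Python sketch (minimization in every objective; names are illustrative):

```python
def dominates(q, p):
    """True if q is at least as good as p in every objective
    and strictly better in at least one."""
    return (all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p)))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

This quadratic-time filter suffices for illustration; the article's workflow then summarizes the variability of such sets across simulations via the Vorob'ev expectation and deviation.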
Abstract:
The goal of the present article is to introduce dual-process theories – in particular the default-interventionist model – as an overarching framework for attention-related research in sports. Dual-process theories propose that two different types of processing guide human behavior: Type 1 processing is independent of available working memory capacity (WMC), whereas Type 2 processing depends on available WMC. We review the latest theoretical developments on dual-process theories and present evidence for their validity from various domains. We demonstrate how existing sport psychology findings can be integrated within the dual-process framework. We then illustrate how future sport psychology research might benefit from adopting the dual-process framework as a meta-theoretical framework: the complex interplay between Type 1 and Type 2 processing has to be taken into account in order to gain a more complete understanding of the dynamic nature of attentional processing during sport performance at varying levels of expertise. Finally, we demonstrate that sport psychology applications might benefit from the dual-process perspective as well: dual-process theories are able to predict which behaviors can be more successfully executed when relying on Type 1 processing and which behaviors benefit from Type 2 processing.
Abstract:
The cichlid fish radiations in the African Great Lakes differ from all other known cases of rapid speciation in vertebrates by their spectacular trophic diversity and richness of sympatric species, comparable to the most rapid angiosperm radiations. I review factors that may have facilitated these radiations and compare these with insights from recent work on plant radiations. Work to date suggests that it was a coincidence of ecological opportunity, intrinsic ecological versatility and genomic flexibility, rapidly evolving behavioral mate choice and large amounts of standing genetic variation that permitted these spectacular fish radiations. I propose that spatially orthogonal gradients in the fit of phenotypes to the environment facilitate speciation because they allow colonization of alternative fitness peaks during clinal speciation despite local disruptive selection. Such gradients are manifold in lakes because of the interaction of water depth as an omnipresent third spatial dimension with other fitness-relevant variables. I introduce a conceptual model of adaptive radiation that integrates these elements and discuss its applicability to, and predictions for, plant radiations.
Abstract:
We present a precise theoretical prediction for the signal-background interference process gg (→ h*) → ZZ, which is useful for constraining the Higgs boson decay width and for measuring the Higgs couplings to the SM particles. The approximate NNLO K-factor is in the range 2.05–2.45 (1.85–2.25), depending on M_ZZ, at the 8 (13) TeV LHC. Soft-gluon resummation increases the approximate NNLO result by about 10% at both the 8 TeV and 13 TeV LHC. The theoretical uncertainties, including the scale, the uncalculated multi-loop amplitudes of the background, and PDF+α_s, are roughly O(10%) at NNLL′. We also confirm that the approximate K-factors of the interference and the pure signal processes are the same.
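For reference, the K-factor quoted above is the ratio of the higher-order to the leading-order prediction, here differential in the ZZ invariant mass (a standard definition; normalization conventions vary between analyses):

```latex
K(M_{ZZ}) \;=\;
\frac{\mathrm{d}\sigma^{\mathrm{NNLO}}/\mathrm{d}M_{ZZ}}
     {\mathrm{d}\sigma^{\mathrm{LO}}/\mathrm{d}M_{ZZ}}
```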