942 results for strategy formulation process
Abstract:
The objective of this work is to present the experience of workshops developed at the University of Sao Paulo by the Integrated Library System in partnership with the Research Commission. The poster presents the main results of workshops held in 2011 in two knowledge areas, life science and engineering, about science publication processes, directed to graduate students, post-doctorates, researchers, professors and library staff. The workshops made it possible to identify gaps in different aspects of scholarly communication, such as research planning, information search strategy, information organization, the submission process and the identification of high-impact journals, areas where professors and librarians can help. Moreover, the workshops revealed that the majority of participants believe in their importance. Despite the ubiquity of digital technology, which transversely impacts all academic activities, it is imperative to promote efforts to find a convergence between information and media literacy in higher education and university research activities. This is particularly important when we talk about how science is produced, communicated and preserved for future use. In this scenario, libraries and librarians assume a new, more active and committed role.
Abstract:
[EN] The seminal work of Horn and Schunck [8] is the first variational method for optical flow estimation. It introduced a novel framework where the optical flow is computed as the solution of a minimization problem. From the assumption that pixel intensities do not change over time, the optical flow constraint equation is derived. This equation relates the optical flow with the derivatives of the image. There are infinitely many vector fields that satisfy the optical flow constraint, thus the problem is ill-posed. To overcome this problem, Horn and Schunck introduced an additional regularity condition that restricts the possible solutions. Their method minimizes both the optical flow constraint and the magnitude of the variations of the flow field, producing smooth vector fields. One of the limitations of this method is that, typically, it can only estimate small motions. In the presence of large displacements, this method fails when the gradient of the image is not smooth enough. In this work, we describe an implementation of the original Horn and Schunck method and also introduce a multi-scale strategy in order to deal with larger displacements. For this multi-scale strategy, we create a pyramidal structure of downsampled images and replace the optical flow constraint equation with a nonlinear formulation. In order to tackle this nonlinear formulation, we linearize it and solve the method iteratively at each scale. In this sense, there are two common approaches: one that computes the motion increment in the iterations, and the one we follow, which computes the full flow during the iterations. The solutions are incrementally refined over the scales. This pyramidal structure is a standard tool in many optical flow methods.
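For reference, the brightness constancy assumption I(x+u, y+v, t+1) = I(x, y, t), linearized by a first-order Taylor expansion, yields the optical flow constraint equation, and the Horn and Schunck energy adds a quadratic smoothness term. A standard way to write this (notation may differ slightly from the implementation described above) is

I_x u + I_y v + I_t = 0,
E(u, v) = \iint_\Omega \left( I_x u + I_y v + I_t \right)^2 dx\,dy \;+\; \alpha \iint_\Omega \left( |\nabla u|^2 + |\nabla v|^2 \right) dx\,dy,

where \alpha > 0 weights the smoothness term. In the multi-scale variant, the linearized data term is replaced by the nonlinear residual I(x + u(x), t+1) - I(x, t), which is re-linearized around the current flow estimate at each pyramid level.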
Abstract:
[EN] We analyze the discontinuity preserving problem in TV-L1 optical flow methods. This type of method typically creates rounded effects at flow boundaries, which usually do not coincide with object contours. A simple strategy to overcome this problem consists in inhibiting the diffusion at high image gradients. In this work, we first introduce a general framework for TV regularizers in optical flow and relate it to some standard approaches. Our survey takes into account several methods that use decreasing functions to mitigate the diffusion at image contours. Consequently, this kind of strategy may produce instabilities in the estimation of the optical flow. Hence, we study the problem of instabilities and show that it actually arises from an ill-posed formulation. From this study, it is possible to arrive at different schemes to solve this problem. One of these consists in separating the pure TV process from the mitigating strategy. This has been used in another work and we demonstrate here that it performs well. Furthermore, we propose two alternatives to avoid the instability problems: (i) we study a fully automatic approach that solves the problem based on the information of the whole image; (ii) we derive a semi-automatic approach that takes into account the image gradients in a close neighborhood, adapting the parameter at each position. In the experimental results, we present a detailed study and comparison between the different alternatives. These methods provide very good results, especially for sequences with a few dominant gradients. Additionally, a surprising effect of these approaches is that they can cope with occlusions. This can be easily achieved by using strong regularizations and high penalizations at image contours.
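As a point of reference, the generic TV-L1 energy and its gradient-weighted variant (this is the standard form in the literature, not necessarily the exact formulation studied here) can be written as

E(u) = \int_\Omega \lambda \, \big| I_1(x + u(x)) - I_0(x) \big| \;+\; g(x) \left( |\nabla u_1| + |\nabla u_2| \right) dx,
\qquad g(x) = \exp\!\left( -\beta \, |\nabla I_0(x)|^{\gamma} \right),

where g is a decreasing function of the image gradient that inhibits diffusion at contours, and g \equiv 1 recovers the pure TV-L1 model. Intuitively, if g is allowed to approach zero, the regularization is switched off at those pixels, which is one way the formulation can become ill-posed.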
Abstract:
Nowadays licensing practices have increased in importance and relevance, driving the widespread diffusion of markets for technologies. Firms are shifting from a tactical to a strategic attitude towards licensing, addressing both business- and corporate-level objectives. The Open Innovation Paradigm has been embraced. Firms rely more and more on collaboration and external sourcing of knowledge. This new model of innovation requires firms to leverage external technologies to unlock the potential of their internal innovative efforts. In this context, firms’ competitive advantage depends both on their ability to recognize available opportunities inside and outside their boundaries and on their readiness to exploit them in order to fuel their innovation process dynamically. Licensing is one of the ways available to firms to reap the advantages associated with an open attitude in technology strategy. From the licensee’s point of view, this implies challenging the so-called not-invented-here syndrome, which affects the more traditional firms that emphasize the myth of internal research and development supremacy. It also entails understanding the so-called cognitive constraints affecting the perfect functioning of markets for technologies, which are associated with the costs of assimilation, integration and exploitation of external knowledge by recipient firms. My thesis aims at shedding light on new and interesting issues associated with in-licensing activities that have been neglected by the literature on licensing and markets for technologies. The reason for this gap is the “perspective bias” affecting the works within this stream of research. With very few notable exceptions, they have generally been concerned with the investigation of the so-called licensing dilemma of the licensor (whether to license out or to internally exploit the in-house developed technologies), while neglecting the licensee’s perspective. In my opinion, this has left room for improving the understanding of the determinants and conditions affecting licensing-in practices. From the licensee’s viewpoint, the licensing strategy deals with the search, integration, assimilation and exploitation of external technologies. As such, it lies at the very heart of the firm’s technology strategy. Improving our understanding of this strategy is thus required to assess the full implications of in-licensing decisions, as they shape firms’ innovation patterns and the evolution of their technological capabilities. It also allows for understanding the so-called cognitive constraints associated with the not-invented-here syndrome. In recognition of that, the aim of my work is to contribute to the theoretical and empirical literature explaining the determinants of the licensee’s behavior, by providing a comprehensive theoretical framework as well as ad-hoc conceptual tools to understand and overcome frictions and to ease the achievement of satisfactory technology transfer agreements in the marketplace. Aiming at this, I investigate licensing-in in three different ways, developed in three research papers. In the first work, I investigate the links between licensing and the patterns of firms’ technological search diversification according to the frameworks of the Search literature, Resource-based Theory and the theory of general purpose technologies. In the second paper, which continues where the first one left off, I analyze the new concept of learning-by-licensing, in terms of the development of new knowledge inside the licensee firm (e.g. new patents) some years after the acquisition of the license, according to the Dynamic Capabilities perspective. Finally, in the third study, I deal with the determinants of the remuneration structure of patent licenses (form and amount), and in particular with the role of the upfront fee from the licensee’s perspective. Aiming at this, I combine the insights of two theoretical approaches: agency theory and real options theory.
Strategy as a matter of beliefs: the recorded music industry reinventing itself by rethinking itself
Abstract:
Managerial and organizational cognition studies the ways the cognitions of managers in groups, organizations and industries shape their strategies and actions. Cognitions refer to simplified representations of managers’ internal and external environments, necessary to cope with the rich, ambiguous information requirements that characterize strategy making. Despite the important achievements in the field, many unresolved puzzles remain as to this process, particularly as to the cognitive factors that condition actors in framing a response to a discontinuity, how actors can change their models in the face of a discontinuity, and the reciprocal relation between cognition and action. I leverage the recent case of the recorded music industry in the face of digital technology to study these issues, through a strategy-oriented study of the way the early response to the discontinuity was constructed and of the subsequent evolution of this response. Through a longitudinal historical and cognitive analysis of actions and cognitions at both the industry and firm level during the period in which the response took place (1999-2010), I gain important insights into the way historical beliefs in the industry shaped the early response to the digital disruption, the role of outsiders in promoting change through a renewed vision of important issues, and the reciprocal relationship between cognitive and strategic change.
Abstract:
This study examines the case of Vietnam and uses the method of process tracing to explore the sources of foreign policy choice and change. Foreign policy is derived from grand strategy, which refers to the full package of a state’s domestic and foreign policies. I argue that a state’s grand strategy results from the interaction of four factors—its society’s historical experience, social motivation, international power, and political contest among domestic groups. Grand strategies emerge as a response to perceived shifts in the balance of international economic, political, and military power. However, this is not to say that international pressures and incentives translate directly into foreign policy. Rather, pressures and incentives are given meaning by worldviews, which reflect a society’s historical experiences of its place in the international system at traumatic junctures of its encounter with the outside world. Strategic changes in foreign policy follow what I call the “strategic algorithm,” which incorporates four major mechanisms—balancing against threat, bandwagoning with power, learning, and survival by transformation. This case study generates hypotheses for a theory of strategic choice, a theory of foreign policy transformation, and a theory of grand strategy emergence.
Abstract:
The aim of the dissertation was to test the feasibility of a new psychotherapeutic protocol for treating children and adolescents with mood and anxiety disorders: Child Well-Being Therapy (CWBT). It originates from the adult Well-Being Therapy protocol (WBT) and represents a conceptual innovation for treating affective disorders. WBT is based on the multidimensional model of well-being postulated by Ryff (eudaimonic perspective), in sequential combination with cognitive-behavioral therapy (CBT). Results showed that eudaimonic well-being was impaired in children with affective disorders in comparison with matched healthy students. A first open investigation aimed at exploring the feasibility of an 8-session CWBT protocol in a group of children with emotional and behavioural disorders was implemented. Data showed that CWBT was associated with symptom reduction, together with a decrease in externalizing problems, maintained at 1-year follow-up. CWBT also triggered an improvement in psychological well-being as well as an increasingly flourishing trajectory over time. Subsequently, a modified and extended version of CWBT (12 sessions) was developed and then tested in a controlled study with 34 patients (8 to 16 years) affected by mood and anxiety disorders. They were consecutively randomized into 3 different groups: CWBT, CBT, and a 6-month waiting list (WL). Both treatments were effective in decreasing distress and in improving well-being. Moreover, CWBT was associated with greater improvement in anxiety and showed a higher recovery rate (83%) than CBT (54%). Both groups maintained the beneficial effects, and the CWBT group displayed a lower level of distress as well as a more positive trend in well-being scores over time. Findings need to be interpreted with caution because of study limitations; however, important clinical implications emerged. Further investigations should determine whether the sequential integration of well-being and symptom-oriented strategies could play an important role in children and adolescents’ psychotherapeutic options, fostering successful adaptation to adversities during the growth process.
Abstract:
The protein silk fibroin (SF) from the silkworm Bombyx mori is an FDA-approved biomaterial used for centuries as suture wire. Importantly, several studies have highlighted the potential of silk biomaterials obtained using so-called regenerated silk fibroin (RSF) in biomedicine, tissue engineering and drug delivery. Indeed, by a water-based protocol it is possible to obtain an aqueous protein solution through the extraction and purification of fibroin from silk fibres. Notably, RSF can be processed into a variety of biomaterial forms used in biomedical and technological fields, displaying remarkable properties such as biocompatibility, controllable biodegradability, optical transparency and mechanical robustness. Moreover, RSF biomaterials can be doped and/or chemically functionalized with drugs, optically active molecules, growth factors and/or chemicals. In this view, the activities of my PhD research program focused on standardizing the process of extraction and purification of the protein to obtain the best physical and chemical characteristics. The analysis of the chemo-physical properties of fibroin involved both the RSF water solution and the protein processed into films. Chemo-physical properties have been studied through vibrational (FT-IR and FT-Raman) and optical (UV-VIS absorption and emission) spectroscopy, nuclear magnetic resonance (1H and 13C NMR), thermal analysis and thermo-gravimetric scans (DSC and TGA). In the last year of my PhD, activities focused on studying and defining innovative methods for the functionalization of silk fibroin solutions and films. Indeed, the research program addressed different approaches to manufacturing fibroin films without the use of harsh treatments and organic solvents. New approaches to the doping and chemical functionalization of silk fibroin were studied, and two different methods have been identified: 1) biodoping, which consists in doping fibroin with optically active molecules through the addition of fluorescent molecules to the standard diet used for the breeding of silkworms; 2) chemical functionalization via silylation.
Abstract:
The functionalization of substrates through the application of nanostructured coatings makes it possible to create new materials with enhanced properties. In this work, the development of self-cleaning and antibacterial textiles through the application of TiO2- and Ag-based nanostructured coatings was carried out. The production of TiO2- and Ag-functionalized materials was achieved both by the classical dip-padding-curing method and by the innovative electrospinning process, used to obtain nanofibers doped with nano-TiO2 and nano-Ag. In order to optimize the production of functionalized textiles, the study focused on understanding the mechanisms involved in the photocatalytic and antibacterial processes and on the real applicability of the products. In particular, a deep investigation of the relationship between nanosol physicochemical characteristics, nanocoating properties and their performance was carried out. Self-cleaning textiles with optimized properties were obtained by properly purifying and applying a commercial TiO2 nanosol, while the studies on the photocatalytic mechanism operating in self-cleaning applications demonstrated the strong influence of hydrophilic properties and of the surface/radical interaction on the final performance. Moreover, a study of the safety of handling nano-TiO2 was carried out, and risk remediation strategies based on a “safety by design” approach were developed. In particular, coating the TiO2 nanoparticles with a SiO2 shell was demonstrated to be the best risk remediation strategy in terms of biological response and preservation of photoreactivity. The obtained results were confirmed by determining reactive oxygen species production with multiple methods. Antibacterial textiles for biotechnological applications were also studied, and Ag-coated cotton materials with significant antibacterial properties were produced. Finally, composite nanofibers were obtained by merging biopolymer processing and sol-gel techniques. Indeed, electrospun nanofibers embedded with TiO2 and Ag NPs, starting from an aqueous keratin-based formulation, were produced, and their photocatalytic and antibacterial properties were assessed. The results confirmed the capability of the electrospun keratin nanofiber matrix to preserve the nanoparticle properties.
Abstract:
In this study, a novel method, MicroJet reactor technology, was developed to enable the custom preparation of nanoparticles. Danazol/HPMCP HP50 and Gliclazide/Eudragit S100 nanoparticles were used as model systems for investigating the effects of process parameters and microjet reactor setup on nanoparticle properties during the microjet reactor construction. Following the feasibility study of the microjet reactor system, three different nanoparticle formulations were prepared using fenofibrate as the model drug. Fenofibrate nanoparticles stabilized with poloxamer 407 (FN), fenofibrate nanoparticles in a hydroxypropyl methyl cellulose phthalate (HPMCP) matrix (FHN) and fenofibrate nanoparticles in an HPMCP and chitosan matrix (FHCN) were prepared under controlled precipitation using MicroJet reactor technology. Particle sizes of all the nanoparticle formulations were adjusted to 200-250 nm. Changes in the experimental parameters altered the system thermodynamics, resulting in the production of nanoparticles between 20-1000 nm (PDI<0.2) with high drug loading efficiencies (96.5% at a 20:1 polymer:drug ratio). Drug release from all nanoparticle formulations was fast and complete after 15 minutes in both FaSSIF and FeSSIF media, whereas in mucoadhesiveness tests only the FHCN formulation was found to be mucoadhesive. Results of the Caco-2 studies revealed that the % dose absorbed values were significantly higher (p<0.01) for FHCN in both cases, where FaSSIF and FeSSIF were used as transport buffers.
Abstract:
Nowadays, rechargeable Li-ion batteries play an important role in portable consumer devices. The formulation of such batteries can be improved by researching new cathodic materials that offer higher cyclability and negligible efficiency loss over cycles. The goal of this work was to investigate a new cathodic material, copper nitroprusside, which presents a porous 3D framework. Synthesis was carried out by a low-cost and scalable co-precipitation method. Subsequently, the product was characterized by means of different techniques, such as TGA, XRF, CHN elemental analysis, XRD, Mössbauer spectroscopy and cyclic voltammetry. Electrochemical tests were finally performed both in coin cells and by using in situ cells: on the one hand, coin cells allowed different formulations to be easily tested; on the other, operando cycling gave a deeper insight into the insertion process and into both the chemical and physical changes. The results of several tests highlighted a non-reversible electrochemical behavior of the material and a rapid capacity fading over time. Moreover, operando techniques showed that amorphisation occurs during discharge.
Abstract:
This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and data processing required for empirical transient emission and torque models. The current work focuses on modelling and optimization. The unexpected result of this investigation is that when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data have been explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution to prevent extrapolation during the optimization process has been proposed and demonstrated. Separate from the issue of extrapolation is preventing the search from being quasi-static. Second-order linear dynamic constraint models have been proposed to prevent the search from returning solutions that are feasible if each point were run at steady state, but which are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters to actually achieved parameters that then feed into the transient emission and torque models. Combined model inaccuracies have been used to adjust the optimized solutions. To frame the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The strategy obtained in this way differs from the corresponding manual calibration strategy, results in lower emissions and efficiency, and is intended to improve rather than replace the manual calibration process.
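As a rough illustration of the dynamic constraint idea, the following sketch (hypothetical Python, not the authors' implementation) maps a commanded parameter trace to an "achieved" trace through a discrete second-order linear lag, which is the kind of model that would then feed the transient emission and torque models:

import numpy as np

def second_order_lag(commanded, wn, zeta, dt):
    """Map a commanded parameter trace to an 'achieved' trace using a
    discrete second-order linear lag (natural frequency wn [rad/s],
    damping ratio zeta, sample time dt [s]). Hypothetical illustration
    of a dynamic constraint model."""
    achieved = np.zeros(len(commanded))
    y, dy = float(commanded[0]), 0.0      # start at the initial command, at rest
    for k, u in enumerate(commanded):
        # y'' = wn^2 (u - y) - 2 zeta wn y'  (standard second-order system)
        ddy = wn**2 * (u - y) - 2.0 * zeta * wn * dy
        dy += ddy * dt
        y += dy * dt
        achieved[k] = y
    return achieved

# Example: a step in a commanded parameter is achieved only with lag/overshoot
cmd = np.concatenate([np.full(50, 1.2), np.full(150, 1.8)])   # illustrative units
ach = second_order_lag(cmd, wn=4.0, zeta=0.7, dt=0.1)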
Abstract:
The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific, gastrointestinal infections in the UK, using the Southampton area in southern England as a test-case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response, spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food-poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastroenteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity. Each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al., 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
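A common formulation for this kind of spatio-temporal surveillance problem, broadly in line with the approach outlined above, treats the reported cases as a Cox point process with intensity

\lambda(x, t) = \lambda_0(x)\, \mu_0(t)\, \exp\{ S(x, t) \},

where \lambda_0(x) captures the purely spatial variation in the population at risk, \mu_0(t) the purely temporal variation (for example day-of-week and seasonal effects), and S(x, t) is a latent stationary Gaussian process; an anomaly is flagged where the predictive probability that \exp\{S(x, t)\} exceeds a chosen threshold becomes large.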
Abstract:
BACKGROUND AND OBJECTIVES: There are no widely accepted criteria for the definition of hematopoietic stem cell transplant-associated microangiopathy (TAM). An International Working Group was formed to develop a consensus formulation of criteria for diagnosing clinically significant TAM. DESIGN AND METHODS: The participants proposed a list of candidate criteria, selected those considered necessary, and ranked those considered optional to identify a core set of criteria. Three obligatory criteria and the four optional criteria that ranked highest formed a core set. In an appropriateness panel process, the participants scored the diagnosis of 16 patient profiles as appropriate or not appropriate for TAM. Using the experts' ratings on the patient profiles as a gold standard, the sensitivity and specificity of 24 candidate definitions of the disorder developed from the core set of criteria were evaluated. A nominal group technique was used to facilitate consensus formation. The definition of TAM with the highest score formed the final proposal. RESULTS: The Working Group proposes that the diagnosis of TAM requires fulfilment of all of the following criteria: (i) >4% schistocytes in blood; (ii) de novo, prolonged or progressive thrombocytopenia (platelet count <50 x 10^9/L or a 50% or greater reduction from previous counts); (iii) sudden and persistent increase in lactate dehydrogenase concentration; (iv) decrease in hemoglobin concentration or increased transfusion requirement; and (v) decrease in serum haptoglobin. The sensitivity and specificity of this definition exceed 80%. INTERPRETATION AND CONCLUSIONS: The Working Group recommends that the presented criteria for TAM be adopted in clinical use, especially in scientific trials.
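Purely as an illustration of how the proposed definition combines the criteria (the function and field names below are hypothetical, not taken from the paper), the rule is a conjunction of five simple checks:

def meets_tam_definition(schistocyte_pct, platelets, platelets_baseline,
                         ldh_increasing, hb_decreasing_or_more_transfusions,
                         haptoglobin_decreased):
    """Hypothetical encoding of the proposed TAM definition: all five
    criteria must be fulfilled simultaneously.
    Platelet counts are expressed in units of 10^9/L."""
    thrombocytopenia = (platelets < 50
                        or platelets <= 0.5 * platelets_baseline)
    return (schistocyte_pct > 4.0
            and thrombocytopenia
            and ldh_increasing
            and hb_decreasing_or_more_transfusions
            and haptoglobin_decreased)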
Abstract:
This dissertation addresses the need for a strategy that will help readers new to new media texts interpret such texts. While scholars in multimodal and new media theory posit rubrics that offer ways to understand how designers use the materialities and media found in overtly designed new media texts (see, e.g., Wysocki, 2004a), these strategies do not account for how readers have to make meaning from those texts. In this dissertation, I discuss how these theories, such as Lev Manovich’s (2001) five principles for determining the new media potential of texts and Gunther Kress and Theo van Leeuwen’s (2001) four strata of designing multimodal texts, are inadequate to the job of helping readers understand new media from a rhetorical perspective. I also explore how literary theory, specifically Wolfgang Iser’s (1978) description of acts of interpretation, can help audiences understand why readers are often unable to interpret the multiple, unexpected modes of communication used in new media texts. Rhetorical theory, explored in a discussion of Sonja Foss’s (2004) units of analysis, is helpful in bringing the reader into a situated context with a new media text, although these units of analysis, like Iser’s process, suggest that a reader has some prior experience interpreting a text-as-artifact. Because of this assumption of knowledge put forth by all of the theories explored within, I argue that none alone is useful for helping readers engage with and interpret new media texts. However, I argue that a heuristic which combines elements from each of these theories, as well as additional ones, is more useful for readers who are new to interpreting the multiple modes of communication that are often used in unconventional ways in new media texts. I describe that heuristic in the final chapter and discuss how it can be usefully applied to a range of texts beyond those labelled new media.