Abstract:
This dissertation focused on the longitudinal analysis of business start-ups using three waves of data from the Kauffman Firm Survey. The first essay used data from 2004-2008 and examined the simultaneous relationship between a firm's capital structure and its human resource policies, and their impact on the level of innovation. Firm leverage was calculated as debt divided by total financial resources. An index of employee well-being was constructed from a set of nine dichotomous questions asked in the survey. A negative binomial fixed effects model was used to analyze the effect of employee well-being and leverage on the count of patents and copyrights, which served as a proxy for innovation. The essay demonstrated that employee well-being positively affected the firm's innovation, while a higher leverage ratio had a negative impact on innovation. No significant relation was found between leverage and employee well-being. The second essay used data from 2004-2009 and asked whether a higher entrepreneurial speed of learning is desirable, and whether there is a linkage between the speed of learning and the growth rate of the firm. The change in the speed of learning was measured using a pooled OLS estimator on repeated cross-sections. There was evidence of a declining speed of learning over time, and it was concluded that a higher speed of learning is not necessarily a good thing, because the speed of learning is contingent on the entrepreneur's initial knowledge and the precision of the signals received from the market. Nor was there reason to expect the speed of learning to be related to the growth of the firm in one direction over another. The third essay used data from 2004-2010 and determined the timing of diversification activities by business start-ups. It captured when a start-up diversified for the first time and explored the association between an early diversification strategy adopted by a firm and its survival rate. A semi-parametric Cox proportional hazard model was used to examine the survival pattern. The results demonstrated that firms diversifying at an early stage of their lives showed a higher survival rate; however, this effect fades over time.
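A minimal sketch of how the third essay's Cox proportional hazard analysis could be set up, using the lifelines library on synthetic data; the covariate name, effect size and seven-year observation window are illustrative assumptions, not values from the Kauffman Firm Survey.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500

# Hypothetical covariate: 1 if the start-up diversified early, 0 otherwise
early_div = rng.integers(0, 2, n)

# Simulate exit times with a lower hazard for early diversifiers
hazard = 0.15 * np.exp(-0.5 * early_div)
time_to_exit = rng.exponential(1.0 / hazard)

# Right-censor firms still alive at the end of a 7-year observation window
event = (time_to_exit <= 7.0).astype(int)
duration = np.minimum(time_to_exit, 7.0)

df = pd.DataFrame({"duration": duration, "event": event, "early_div": early_div})
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()  # a negative coefficient on early_div implies a survival advantage
```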
Abstract:
Neuroaesthetics is the study of the brain’s response to artistic stimuli. The neuroscientist V.S. Ramachandran contends that art is primarily “caricature” or “exaggeration.” Exaggerated forms hyperactivate neurons in viewers’ brains, which in turn produce specific, “universal” responses. Ramachandran identifies a precursor for his theory in the concept of rasa (literally “juice”) from classical Hindu aesthetics, which he associates with “exaggeration.” The canonical Sanskrit texts of Bharata Muni’s Natya Shastra and Abhinavagupta’s Abhinavabharati, however, do not support Ramachandran’s conclusions. They present audiences as dynamic co-creators, not passive recipients. I believe we could more accurately model the neurology of Hindu aesthetic experiences if we took indigenous rasa theory more seriously as qualitative data that could inform future research.
Abstract:
LysR-type transcriptional regulators (LTTRs) are emerging as key circuit components in regulating microbial stress responses and are implicated in modulating oxidative stress in the human opportunistic pathogen Pseudomonas aeruginosa. The oxidative stress response encapsulates several strategies to overcome the deleterious effects of reactive oxygen species. However, many of the regulatory components and associated molecular mechanisms underpinning this key adaptive response remain to be characterised. Comparative analysis of publicly available transcriptomic datasets led to the identification of a novel LTTR, PA2206, whose expression was altered in response to a range of host signals in addition to oxidative stress. PA2206 was found to be required for tolerance to H2O2 in vitro and for lethality in vivo in the zebrafish embryo model of infection. Transcriptomic analysis in the presence of H2O2 showed that PA2206 altered the expression of 58 genes, including a large repertoire of oxidative stress and iron-responsive genes, independently of the master regulator of oxidative stress, OxyR. Contrary to the classic mechanism of LysR regulation, PA2206 did not autoregulate its own expression and did not influence the expression of adjacent or divergently transcribed genes. The PA2214-15 operon was identified as a direct target of PA2206, with truncated promoter fragments revealing binding to the 5'-ATTGCCTGGGGTTAT-3' LysR box adjacent to the predicted -35 region. PA2206 also interacted with the pvdS promoter, suggesting a global dimension to the PA2206 regulon and indicating that PA2206 is an important regulatory component of P. aeruginosa adaptation during oxidative stress.
Abstract:
This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and is still undergoing, like statistical analysis in general, a transformation towards high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or the more recently developed m-approximability conditions, which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases. We derive sharp conditions on the »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. In particular, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels, and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series close to the non-stationary case the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
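A minimal univariate sketch of the CUSUM »change in the mean« test statistic and estimator that the thesis extends to Hilbert space settings with Darling-Erdős-type weightings; the i.i.d. variance estimate below is a simplifying assumption, as dependent data would require a long-run variance estimator.

```python
import numpy as np

def cusum_change_test(x):
    """Max-CUSUM statistic and argmax change point estimate for one mean change."""
    n = len(x)
    s = np.cumsum(x - x.mean())            # partial sums S_k = sum_{i<=k}(x_i - x_bar)
    sigma = x.std(ddof=1)                  # naive scale; assumes i.i.d. observations
    stat = np.abs(s[:-1]) / (sigma * np.sqrt(n))
    k_hat = int(np.argmax(stat)) + 1       # estimated change location
    return stat.max(), k_hat

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 150), rng.normal(0.8, 1.0, 150)])
stat, k_hat = cusum_change_test(x)
print(f"max CUSUM statistic = {stat:.3f}, estimated change point = {k_hat}")
```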
Abstract:
In Europe, concerns about the status of marine ecosystems have increased, and the Marine Directive has as its main goal the achievement of Good Environmental Status (GES) of EU marine waters by 2020. Molecular tools are seen as promising and emerging approaches to improve ecosystem monitoring, and have led ecology into a new era, representing perhaps the greatest source of innovation in marine monitoring techniques. Benthic nematodes are considered ideal organisms for use as biological indicators of natural and anthropogenic disturbance in aquatic ecosystems, underpinning monitoring programmes on the ecological quality of marine ecosystems and very useful for assessing the GES of the marine environment. dT-RFLP (directed Terminal-Restriction Fragment Length Polymorphism) makes it possible to assess the diversity of nematode communities and also to study ecosystem functioning; combined with relative real-time PCR (qPCR), it provides a high-throughput, semi-quantitative characterization of nematode communities. These characteristics make the two molecular tools good descriptors for GES assessment. The main aim of this study is to develop and optimize dT-RFLP and qPCR in the Mira estuary (SW coast, Portugal). A molecular phylogenetic analysis of marine and estuarine nematodes is being performed, combining morphological and molecular analysis to evaluate the diversity of free-living marine nematodes in the Mira estuary. After morphological identification, barcodes of the 18S rDNA and COI genes are being determined for each nematode species morphologically identified. So far we have generated 40 new sequences belonging to 32 different genera and 17 families, and the study has shown a good degree of concordance between traditional morphology-based identification and DNA sequences. These results will improve the assessment of marine nematode diversity and contribute to a more robust nematode taxonomy. The DNA sequences are being used to develop a dT-RFLP assay able to process large numbers of samples easily (hundreds to thousands), rather than the numbers typical of classical taxonomic or low-throughput molecular analyses. A preliminary study showed that the digest enzymes used in dT-RFLP for terrestrial assemblages separated marine nematodes poorly at the taxonomic level required for functional group analysis. A new digest combination was therefore designed using the software tool DRAT (Directed Terminal Restriction Analysis Tool) to distinguish marine nematode taxa. Several solutions were provided by DRAT and tested empirically to select the one that cuts most efficiently. A combination of three enzymes in a single digest proved to be the best solution to separate the different clusters. In parallel, another tool is being developed to estimate population size (qPCR): an improved qPCR estimation of gene copy number using an artificial reference is being developed for marine nematode communities to quantify abundance. Once developed, it is proposed to validate both methodologies by determining the spatial and temporal variability of benthic nematode assemblages across different environments. The application of these high-throughput molecular approaches to benthic nematodes will improve sample throughput and make their implementation as indicators of the ecological status of marine ecosystems more efficient and faster.
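A hypothetical sketch of the idea behind dT-RFLP digest design: each taxon's barcode sequence yields a terminal restriction fragment length per enzyme, and a good enzyme combination gives distinguishable profiles across taxa. The enzyme recognition sites are real, but the cut model is crude and the sequences are invented stand-ins, not the study's 18S barcodes.

```python
ENZYMES = {            # enzyme name -> recognition site
    "AluI": "AGCT",
    "HhaI": "GCGC",
}

def terminal_fragment(seq: str, site: str):
    """Length of the 5'-terminal fragment after the first cut, if any (crude mid-site cut)."""
    pos = seq.find(site)
    return None if pos == -1 else pos + len(site) // 2

sequences = {          # toy stand-ins for barcoded 18S rDNA sequences
    "taxon_A": "TTGAAGCTAGGCGCAATTCCGGAGAGGGAGCCTGAGAAACGGCTA",
    "taxon_B": "TTGAGGGAGCCTGAGAAACGGCTACCACATCCAAGCTGGCGCAAT",
}

for name, seq in sequences.items():
    profile = {enz: terminal_fragment(seq, site) for enz, site in ENZYMES.items()}
    print(name, profile)   # distinct fragment-length profiles -> taxa separable
```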
Abstract:
The study and preservation of museum collections requires complete knowledge and understanding of the constituent materials, which can be natural, synthetic, or semi-synthetic polymers. In former times, objects were incorporated into museum collections and classified solely by their appearance. New studies, prompted by severe degradation processes or by conservation-restoration actions, help shed light on the materiality of objects, which can contradict the original information or assumptions. The case study presented here is a box dating from the beginning of the 20th century that belongs to the Portuguese National Ancient Art Museum. Museum curators classified it as a tortoiseshell box decorated with gold applications solely on the basis of visual inspection and the information provided by the donor. The box has visible signs of degradation, with white veils initially assumed to be the result of biological degradation of a proteinaceous matrix. This paper presents the methodological rationale behind the study and proposes a totally non-invasive methodology for the identification of polymeric materials in museum artifacts. The analysis of surface leachates using 1H and 13C nuclear magnetic resonance (NMR), complemented by in situ attenuated total reflection infrared spectroscopy (ATR FT-IR), allowed for full characterization of the object's substratum. The NMR technique unequivocally identified a great number of additives, and ATR FT-IR provided information about the polymer structure while also confirming the presence of additives. The pressure applied during ATR FT-IR spectroscopy did not cause any physical change in the structure of the material at the surface level (e.g., color, texture, brightness). In this study, variable pressure scanning electron microscopy (VP-SEM-EDS) was also used to obtain the elemental composition of the metallic decorations. Additionally, microbiological and enzymatic assays were performed in order to identify the possible biofilm composition and to understand the role of microorganisms in the biodeterioration process. Using these methodologies, the box was correctly identified as being made of cellulose acetate plastic with brass decorations, and the white film was identified as being composed mainly of polymer exudates, namely sulphonamides and triphenyl phosphate.
Abstract:
Conservation Agriculture (CA) is mostly referred to in the literature as having three principles at the core of its identity: minimum soil disturbance, permanent organic soil cover and crop diversity. This farming package has been described as suitable for improving the yields and livelihoods of smallholders in semi-arid regions of Kenya, which since the colonial period have been heavily subjected to tillage. Our study is based on a qualitative approach that followed local meanings and understandings of soil fertility, rainfall and CA in Ethi and Umande, located in the semi-arid region of Laikipia, Kenya. Farm visits, 53 semi-structured interviews and informal talks were carried out from April to June 2015. The Ethi and Umande locations were part of a resettlement programme after the independence of Kenya that brought together people coming from different farming contexts. Since the 1970s-80s, the state and NGOs have been promoting several approaches to control erosion and boost soil fertility. In this context, CA has also been promoted preferentially since 2007. Interviewees were well acquainted with soil erosion and the methods to control it. Today, rainfall amount and distribution are identified as the major constraints on crop performance. Soil fertility is understood to be under control, since farmers use several methods to boost it (inorganic fertilisers, manure, terraces, agroforestry, vegetation barriers). CA is recognised to deliver better yields, but it is not able to perform well under severe drought and does not provide yields as high as “promised” in promotion campaigns. Moreover, CA is mainly understood as “cultivating with chemicals”, “kulima na dawa” in Kiswahili. A dominant view is that CA is about minimum tillage and the use of pre-emergence herbicides. It is relevant to reflect on what kind of CA is being promoted and whether elements like soil cover and crop rotation are given due attention. CA based on these two ideas, minimum tillage and the use of herbicides, can hardly stand as a programme to be promoted and up-scaled. Therefore CA appears not to be recognised as a convincing approach to improve livelihoods in Laikipia.
Abstract:
With the theme of fracture of finite-strain plates and shells based on a phase-field model of crack regularization, we introduce a new staggered algorithm for elastic and elasto-plastic materials. To account for correct fracture behavior in bending, two independent phase-fields are used, corresponding to the lower and upper faces of the shell. This is shown to provide realistic behavior in bending-dominated problems, here illustrated in classical beam and plate problems. Finite-strain behavior for both elastic and elasto-plastic constitutive laws is made compatible with the phase-field model by use of a consistent updated-Lagrangian algorithm. To guarantee sufficient resolution in the definition of the crack paths, a local remeshing algorithm based on the phase-field values at the lower and upper shell faces is introduced. This local remeshing algorithm has two stages: edge-based element subdivision and node repositioning. Five representative numerical examples are shown, consisting of a bi-clamped beam, two versions of a square plate, the Keesecker pressurized cylinder problem, the Hexcan problem and the Muscat-Fenech and Atkins plate. All problems were successfully solved and the proposed solution was found to be robust and efficient.
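As a rough illustration of what a staggered phase-field scheme alternates between, the sketch below solves a 1D small-strain bar with an AT2-type model: a displacement solve with degraded stiffness, a history-field update, and a phase-field solve, repeated within each load step. This is only a 1D analogue under assumed parameters; the paper's algorithm is finite-strain and shell-based, with two face-wise phase fields and local remeshing.

```python
import numpy as np

n, L = 201, 1.0
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
E, Gc, l0, k = 1.0, 2e-4, 0.02, 1e-8   # modulus, toughness, length scale, residual stiffness

d = np.zeros(n)   # phase field (0 = intact, 1 = fully broken)
H = np.zeros(n)   # history field: maximum tensile energy density seen so far

def solve_u(d, u_end):
    """Equilibrium of the bar with degraded stiffness ((1-d)^2 + k)E, u(0)=0, u(L)=u_end."""
    g = (1.0 - d) ** 2 + k
    ge = 0.5 * (g[:-1] + g[1:])          # element-wise degradation
    K = np.zeros((n, n))
    for e in range(n - 1):
        ke = E * ge[e] / h
        K[e:e + 2, e:e + 2] += ke * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n)
    K[0, :], K[0, 0], f[0] = 0.0, 1.0, 0.0          # u(0) = 0
    K[-1, :], K[-1, -1], f[-1] = 0.0, 1.0, u_end    # u(L) = u_end
    return np.linalg.solve(K, f)

def solve_d(H):
    """AT2 phase-field equation (Gc/l0 + 2H) d - Gc l0 d'' = 2H, natural BCs."""
    c = Gc * l0 / h ** 2
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = Gc / l0 + 2.0 * H[i] + 2.0 * c
        if i > 0:
            A[i, i - 1] = -c
        if i < n - 1:
            A[i, i + 1] = -c
    A[0, 1] *= 2.0; A[-1, -2] *= 2.0     # mirror ghost nodes enforce d' = 0
    return np.linalg.solve(A, 2.0 * H)

for u_end in np.linspace(0.0, 0.15, 30):
    for _ in range(5):                    # staggered iterations per load step
        u = solve_u(d, u_end)
        eps = np.gradient(u, x)
        H = np.maximum(H, 0.5 * E * np.maximum(eps, 0.0) ** 2)  # tension-only history
        d = solve_d(H)

print(f"maximum damage after loading: {d.max():.3f}")
```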
Abstract:
A new semi-implicit stress integration algorithm for finite strain plasticity (compatible with hyperelasticity) is introduced. Its most distinctive feature is the use of different parameterizations of the equilibrium and reference configurations. Rotation terms (nonlinear trigonometric functions) are integrated explicitly and correspond to a change in the reference configuration. In contrast, relative Green–Lagrange strains (which are quadratic in terms of displacements) represent the equilibrium configuration implicitly. In addition, the adequacy of several objective stress rates in the semi-implicit context is studied. We parametrize both the reference and equilibrium configurations, in contrast with the so-called objective stress integration algorithms, which use coinciding configurations. A single constitutive framework provides the quantities needed by common discretization schemes. This is computationally convenient and robust, as all elements only need to provide pre-established quantities, irrespective of the constitutive model. In this work, mixed strain/stress control is used, as well as our smoothing algorithm for the complementarity condition. Exceptional time-step robustness is achieved in elasto-plastic problems: often fewer than one-tenth of the typical number of time increments can be used with a quantifiable effect on accuracy. The proposed algorithm is general: all hyperelastic models and all classical elasto-plastic models can be employed. Plane-stress, shell and 3D examples are used to illustrate the new algorithm. Both isotropic and anisotropic behavior are presented in elasto-plastic and hyperelastic examples.
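For context, the sketch below implements the classical small-strain radial-return mapping for von Mises plasticity with linear isotropic hardening, the textbook elastic-predictor/plastic-corrector baseline that finite-strain stress integration schemes such as this one generalize. It is not the paper's semi-implicit algorithm; the material parameters are arbitrary.

```python
import numpy as np

def radial_return(eps, eps_p_old, alpha_old, E=200e3, nu=0.3, sy0=250.0, Hiso=1e3):
    """One strain-driven elastic-predictor / plastic-corrector step (3x3 tensors, MPa)."""
    mu = E / (2.0 * (1.0 + nu))
    lam = E * nu / ((1.0 + nu) * (1.0 - 2.0 * nu))
    eps_e = eps - eps_p_old                                # trial elastic strain
    sig_tr = lam * np.trace(eps_e) * np.eye(3) + 2.0 * mu * eps_e
    s_tr = sig_tr - np.trace(sig_tr) / 3.0 * np.eye(3)     # deviatoric trial stress
    q_tr = np.sqrt(1.5) * np.linalg.norm(s_tr)             # von Mises equivalent stress
    f_tr = q_tr - (sy0 + Hiso * alpha_old)                 # trial yield function
    if f_tr <= 0.0:
        return sig_tr, eps_p_old, alpha_old                # elastic step
    dgamma = f_tr / (3.0 * mu + Hiso)                      # consistency condition
    nhat = s_tr / np.linalg.norm(s_tr)                     # radial return direction
    sig = sig_tr - 2.0 * mu * dgamma * np.sqrt(1.5) * nhat
    eps_p = eps_p_old + dgamma * np.sqrt(1.5) * nhat
    return sig, eps_p, alpha_old + dgamma

# Hypothetical strain state driving the material past yield
sig, eps_p, alpha = radial_return(np.diag([2e-3, -1e-3, -1e-3]), np.zeros((3, 3)), 0.0)
print(alpha > 0.0)   # True: the step is plastic
```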
Abstract:
In this study, we carried out a comparative analysis between two classical methodologies to prospect residue contacts in proteins: the traditional cutoff-dependent (CD) approach and the cutoff-free Delaunay tessellation (DT). In addition, two alternative coarse-grained forms to represent residues were tested: using the alpha carbon (CA) and the side chain geometric center (GC). A database was built comprising three top classes: all alpha, all beta, and alpha/beta. We found that a cutoff value at about 7.0 Å emerges as an important distance parameter. Up to 7.0 Å, CD and DT properties are unified, which implies that at this distance all contacts are complete and legitimate (not occluded). We have also shown that DT has an intrinsic missing-edges problem when mapping the first layer of neighbors. In proteins, it may produce systematic errors affecting mainly the contact network in beta chains with CA. The almost-Delaunay (AD) approach has been proposed to solve this DT problem. We found that even AD may not be an advantageous solution. As a consequence, in the strict range up to 7.0 Å, the CD approach proved to be a simpler, more complete, and more reliable technique than DT or AD. Finally, we have shown that coarse-grained residue representations may introduce bias into the analysis of neighbors at cutoffs up to 6.8 Å, with CA favoring alpha proteins and GC favoring beta proteins. This provides an additional argument pointing to the value of 7.0 Å as an important lower-bound cutoff to be used in contact analysis of proteins.
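A brief sketch of the two contact definitions being compared, on random points standing in for C-alpha coordinates; the 7.0 Å cutoff is the paper's, everything else is toy data.

```python
import numpy as np
from scipy.spatial import Delaunay, distance_matrix

rng = np.random.default_rng(42)
coords = rng.uniform(0.0, 30.0, size=(60, 3))   # toy "residue" coordinates (Angstrom)

# CD: all residue pairs closer than the 7.0 A cutoff
D = distance_matrix(coords, coords)
cd_contacts = {(i, j) for i in range(len(coords))
               for j in range(i + 1, len(coords)) if D[i, j] <= 7.0}

# DT: all edges of tetrahedra in the Delaunay tessellation
tri = Delaunay(coords)
dt_contacts = set()
for simplex in tri.simplices:
    for a in range(4):
        for b in range(a + 1, 4):
            i, j = sorted((int(simplex[a]), int(simplex[b])))
            dt_contacts.add((i, j))

print(f"CD: {len(cd_contacts)}, DT: {len(dt_contacts)}, "
      f"shared: {len(cd_contacts & dt_contacts)}")
```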
Abstract:
This PhD thesis discusses antitrust enforcement against anti-competitive vertical agreements in Europe and in Brazil from an institutional perspective. It considers both the evolution of the legal framework and the application of the existing policies, with the analysis of case studies. The research highlights the main challenges of the current approaches adopted by the competition authorities in these jurisdictions and formulates specific proposals for future improvements. Because the Brazilian competition rules were originally inspired by the European legal framework, this thesis also summarizes the contemporary discussions regarding comparative law and the efficiency of transplanting laws and good practices. From a Law & Economics perspective, vertical agreements have always been a paradoxical topic and constitute one of the most dynamic areas of dispute in antitrust enforcement. The reason for this concern is that such contracts among companies are complex in nature. Against this background, the thesis provides an original analysis of the pro- and anti-competitive effects of vertical agreements, based on the classical Law & Economics literature. One of the novelties of the research is the extension of the economic analysis of vertical agreements to new forms of contractual abuse in the context of digital markets, such as the contractual restrictions being put into practice on e-commerce platforms. The international comparative approach focuses on the Brazilian and European experiences, and opens up a reflection on the policy recommendations applicable to several countries with similar economic and institutional realities.
Abstract:
This research aims to contribute to a better understanding of changes in local governments’ accounting and reporting practices: particularly, ‘why’, ‘what’ and ‘how’ environmental aspects are included, and the significance of changes across time. It adopts an interpretative approach to conduct a longitudinal analysis of case studies. Pettigrew and Whipp’s framework on context, content and process is used as a lens to distinguish changes under each dimension and analyse their interconnections. Data are collected from official documents and triangulated with semi-structured interviews. The legal framework defines as the boundaries of the accounting information the territory under local governments’ jurisdiction and their immediate surrounding area. Organisational environmental performance and externalities are excluded from the requirements. An interplay between the local outer context, political commitment and organisational culture justifies the implementation of changes beyond what is regulated and the implementation of transformational changes. Local governments engage in international networks to gain access to funding and implement changes, leading them to adopt the dominant environmental agenda. Key stakeholders, such as citizens, are not engaged in the accounting and reporting process. Thus, there is no evidence that the environmental aspects addressed, and the related changes, align with stakeholders’ needs and expectations, which jeopardises their significance. Findings from the current research have implications for other EU member states due to the harmonisation of accounting and reporting practices and the common practice across the EU of using external funding to conceptualise and implement changes. This implies that other local governments could also be presenting a limited account of environmental aspects.
Abstract:
A relevant problem in polyolefin processing is the presence of volatile and semi-volatile compounds (VOCs and SVOCs), such as linear-chain alkanes, found in the final products. These VOCs can be detected by customers through their unpleasant smell and can be an environmental issue; at the same time, they can cause negative side effects during processing. Since no previously standardized analytical techniques for polymeric matrices are available in the literature, we implemented different VOC extraction methods and gas chromatographic analyses for quali-quantitative studies of such compounds. In the literature, different procedures can be found, including microwave-assisted extraction (MAE) and thermodesorption (TDS), used for different purposes. TDS coupled with GC-MS is necessary for the identification of the different compounds in the polymer matrix. Although quantitative determination is complex, the results obtained from TDS/GC-MS show that the by-products are mainly linear-chain oligomers with an even number of carbons in the C8-C22 range (for HDPE). In order to quantify these linear alkane by-products, a more accurate GC-FID determination with an internal standard was run on the MAE extracts. Regardless of the type of extruder used, it is difficult to distinguish the effects of the various processes, all of which in any case yield a content of low-boiling substances below that of the corresponding virgin polymer. The two HDPEs studied can be distinguished on the basis of the quantity of analytes found; therefore, the production process is mainly responsible for the amount of VOCs and SVOCs observed. The extruder technology used by Sacmi SC achieves a significant reduction in VOCs compared to the conventional screw system. This result is significant because a lower quantity of volatile substances leads to lower migration of such materials, especially when the polymer is used for food packaging.
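A minimal sketch of internal-standard quantification as typically applied in GC-FID work of this kind: the analyte mass follows from peak-area ratios scaled by a relative response factor (RRF). All peak areas, masses and names below are invented for illustration.

```python
def analyte_mass(area_analyte, area_is, mass_is, rrf):
    """m_analyte = (A_analyte / A_IS) * m_IS / RRF"""
    return (area_analyte / area_is) * mass_is / rrf

# Hypothetical calibration: RRF from a standard of known composition,
# RRF = (A_analyte / A_IS) * (m_IS / m_analyte)
rrf_c12 = (5.2e5 / 4.8e5) * (10.0 / 10.0)

# Hypothetical sample run: quantify dodecane (C12) in a MAE extract
print(f"C12 in extract: {analyte_mass(3.1e5, 4.6e5, 10.0, rrf_c12):.2f} ug")
```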
Abstract:
In the last two decades, authors have begun to expand classical stochastic frontier (SF) models in order to include spatial components. Indeed, firms tend to concentrate in clusters, taking advantage of positive agglomeration externalities due to cooperation, shared ideas and emulation, resulting in increased productivity levels. Until now, scholars have introduced spatial dependence into SF models following two different paths: evaluating global and local spatial spillover effects related to the frontier, or considering spatial cross-sectional correlation in the inefficiency and/or in the error term. In this thesis, we extend the current literature on spatial SF models by introducing two novel specifications for panel data. First, besides considering productivity and input spillovers, we introduce the possibility of evaluating the specific spatial effects arising from each inefficiency determinant through their spatial lags, aiming to also capture knowledge spillovers. Second, we develop a very comprehensive spatial SF model that includes both frontier and error-based spillovers in order to consider four different sources of spatial dependence (i.e. productivity and input spillovers related to the frontier function, and behavioural and environmental correlation associated with the two error terms). Finally, we test the finite sample properties of the two proposed spatial SF models through simulations, and we provide two empirical applications to the Italian accommodation and agricultural sectors. From a practical perspective, policymakers can rely on the precise, detailed and distinct insights these models provide into the spillover effects affecting the productive performance of neighbouring spatial units, obtaining relevant suggestions for policy decisions.
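A small sketch of the building block behind such spatial specifications: a row-standardized weight matrix W and the spatial lag Wz of an inefficiency determinant z, whose coefficient in a spatial SF model would capture the knowledge spillovers mentioned above. The adjacency list and data are invented.

```python
import numpy as np

neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}  # toy contiguity structure
n = len(neighbours)

W = np.zeros((n, n))
for i, js in neighbours.items():
    W[i, js] = 1.0 / len(js)            # row-standardization: rows sum to one

z = np.array([1.2, 0.7, 2.0, 0.4])      # e.g. an inefficiency determinant per unit
Wz = W @ z                              # spatial lag: average z of each unit's neighbours
print(Wz)
```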
Abstract:
Today's data are increasingly complex, and classical statistical techniques need increasingly refined mathematical tools to model and investigate them. Paradigmatic situations are represented by data which need to be considered up to some kind of transformation, and by all those circumstances in which the analyst needs to define a general concept of shape. Topological Data Analysis (TDA) is a field which is fundamentally contributing to such challenges by extracting topological information from data with a plethora of interpretable and computationally accessible pipelines. We contribute to this field by developing a series of novel tools, techniques and applications to work with a particular topological summary called the merge tree. To analyze sets of merge trees, we introduce a novel metric structure along with an algorithm to compute it, define a framework to compare different functions defined on merge trees, and investigate the metric space obtained with the aforementioned metric. Different geometric and topological properties of the space of merge trees are established, with the aim of obtaining a deeper understanding of such trees. To showcase the effectiveness of the proposed metric, we develop an application in the field of Functional Data Analysis, working with functions up to homeomorphic reparametrization, and in the field of radiomics, where each patient is represented via a clustering dendrogram.
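A small sketch of the object at the center of the thesis: the merge tree of sublevel sets of a scalar function, here reduced to its merge events for a 1D sample via a union-find sweep. The metric between trees, the algorithm to compute it, and the applications are the thesis' contributions and are not reproduced here.

```python
import numpy as np

def merge_events(f):
    """Sweep values in increasing order; record (branch birth, merge height) pairs."""
    order = np.argsort(f)
    parent = {}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    births, events = {}, []
    for i in order:
        parent[i] = i
        births[i] = f[i]
        for j in (i - 1, i + 1):            # 1D neighbours already activated
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # the younger branch (larger birth value) merges into the older
                    young, old = (ri, rj) if births[ri] > births[rj] else (rj, ri)
                    events.append((births[young], f[i]))
                    parent[young] = old
    return events

f = np.array([3.0, 1.0, 4.0, 0.5, 2.5, 0.8, 3.5])
print(merge_events(f))   # each pair: where a branch is born, where it merges
```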