959 results for field methods


Relevance:

30.00%

Publisher:

Abstract:

Book review: Organizations in Time, edited by R. Daniel Wadhwani and Marcelo Bucheli, Oxford University Press, 2014. The title of this edited volume is slightly misleading, as its various contributions explore the potential for more historical analysis in organization studies rather than addressing issues associated with time and organizing. Hopefully this will not distract from the important achievement of this volume—important especially for business historians—in further expanding and integrating business history into management and organization studies. The various contributions, elegantly tied together by R. Daniel Wadhwani and Marcelo Bucheli in their substantial introduction (which, by the way, presents a significant contribution in its own right), open up new sets of questions, especially in terms of future methodological and theoretical developments in the field. This book also reflects the changing institutional location of business historians, who increasingly make their careers in business schools rather than history departments, especially in Europe, reopening old questions of history as a social science. There have been several calls to teach more history in business education, such as the Carnegie Foundation report (2011), which found undergraduate business education too narrow in focus and highlighted the need to integrate more liberal arts teaching into the curriculum. However, in the contemporary research-driven environment of business and management schools, historical understanding is unlikely to permeate the curriculum if historical analysis cannot first deliver significant theoretical contributions. This is the central theme around which this edited volume revolves, and it marks a milestone in this ongoing debate. (In the spirit of full disclosure, I should add that even though I did not contribute to this volume, I have coauthored with several of its contributors and view this book as central to my current research practice.)

Relevance:

30.00%

Publisher:

Abstract:

Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI, the purpose of practical research contributions is to reveal previously unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, focus groups, surveys, usability tests, case studies, diary studies, ethnography, contextual inquiry, experience sampling, and automated data collection. In this paper, we report on our experience using focus groups, surveys and interviews as evaluation methods, and on how we adapted these methods to develop artefacts: either interface designs or information and technological systems. Four projects exemplify how these different methods were applied to gather information about users' wants, habits, practices, concerns and preferences. The goal was to build an understanding of the attitudes and satisfaction of the people who might interact with a technological artefact or information system. In turn, we intended to design information systems and technological applications that promote resilience in organisations (a set of routines that allows recovery from obstacles) and positive user experiences. Organisations can also be viewed here from a systems perspective, which means that system perturbations, and even failures, can be characterised and addressed. The term resilience has been applied to everything from real estate to the economy, sports, events, business, psychology, and more. In this study, we highlight that resilience is also made up of a number of skills and abilities (self-awareness, creating meaning from experiences, self-efficacy, optimism, and building strong relationships) that serve as foundational ingredients people can draw on when enhancing an organisation's resilience. Resilience enhances knowledge of the resources available to people confronting existing problems.

Relevance:

30.00%

Publisher:

Abstract:

Geophysical surveying and geoelectrical methods are effective for studying permafrost distribution and conditions in polar environments. Geoelectrical methods are particularly suited to studying the spatial distribution of permafrost because of its high electrical resistivity in comparison with that of soil or rock above 0 °C. In the South Shetland Islands, permafrost is considered to be discontinuous up to elevations of 20–40 m a.s.l., changing to continuous at higher altitudes. There are no specific data about the distribution of permafrost in Byers Peninsula, on Livingston Island, which is the largest ice-free area in the South Shetland Islands. With the purpose of better understanding the occurrence of permanently frozen conditions in this area, a geophysical survey using an electrical resistivity tomography (ERT) methodology was conducted during the January 2015 field season, combined with geomorphological and ecological studies. Three overlapping electrical resistivity tomographies of 78 m each were carried out along the same profile, which ran from the coast to the highest raised beaches. The three electrical resistivity tomographies are combined into an electrical resistivity model that represents the distribution of the electrical resistivity of the ground to depths of about 13 m along 158 m. Several patches of high electrical resistivity were found and interpreted as patches of sporadic permafrost. The lower limits of sporadic to discontinuous permafrost in the area are confirmed by the presence of permafrost-related landforms nearby. There is a close correspondence between moss patches and permafrost patches along the geoelectrical transect.
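As a rough illustration of how geoelectrical readings translate into the resistivity values interpreted above, the sketch below converts a single measurement into apparent resistivity for a Wenner electrode array; the array type, spacing and readings are assumptions for illustration and are not taken from this survey.

```python
import numpy as np

def wenner_apparent_resistivity(delta_v, current, spacing):
    """Apparent resistivity (ohm.m) for a Wenner array:
    rho_a = 2 * pi * a * (dV / I), with electrode spacing a in metres."""
    return 2.0 * np.pi * spacing * (delta_v / current)

# Hypothetical reading: 0.12 V measured while injecting 0.05 A at 5 m spacing.
print(wenner_apparent_resistivity(0.12, 0.05, 5.0))  # ~75.4 ohm.m
```

Inverting many such readings collected along a profile is what produces the kind of two-dimensional resistivity model described in the abstract.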

Relevance:

30.00%

Publisher:

Abstract:

Analytics is the technology concerned with manipulating data to produce information capable of changing the world we live in every day. Analytics has been widely used over the last decade to cluster people's behaviour and predict their preferences: items to buy, music to listen to, movies to watch, and even electoral preferences. The most advanced companies have succeeded in steering people's behaviour using analytics. Despite this evidence of the power of analytics, it is rarely applied to the big data collected within supply chain systems (i.e. distribution networks, storage systems and production plants). This PhD thesis explores the fourth research paradigm (i.e. the generation of knowledge from data) applied to supply chain system design and operations management. An ontology defining the entities and the metrics of supply chain systems is used to design data structures for data collection in supply chain systems. The consistency of these data is ensured by mathematical demonstrations inspired by factory physics theory. The availability, quantity and quality of the data within these data structures define different decision patterns. Ten decision patterns are identified, and validated in the field, to address ten different classes of design and control problems in supply chain systems research.
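As a minimal sketch of the kind of data structure such an ontology might induce, the snippet below defines hypothetical supply-chain entities (nodes and flows) and a simple consistency metric based on Little's law from factory physics; all names and fields are illustrative assumptions, not the structures defined in the thesis.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    kind: str                  # e.g. "plant", "warehouse", "distribution_center"
    throughput_per_day: float  # units processed per day

@dataclass
class Flow:
    origin: Node
    destination: Node
    quantity_per_day: float
    lead_time_days: float

@dataclass
class SupplyChain:
    nodes: List[Node] = field(default_factory=list)
    flows: List[Flow] = field(default_factory=list)

    def average_wip(self) -> float:
        """Little's law (WIP = throughput x lead time), summed over flows."""
        return sum(f.quantity_per_day * f.lead_time_days for f in self.flows)

plant = Node("Plant A", "plant", 1200.0)
dc = Node("DC 1", "distribution_center", 1000.0)
chain = SupplyChain(nodes=[plant, dc], flows=[Flow(plant, dc, 1000.0, 2.0)])
print(chain.average_wip())  # 2000 units in transit or storage on average
```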

Relevance:

30.00%

Publisher:

Abstract:

Microplastics have become ubiquitous pollutants in the marine environment. Ingestion of microplastics by a wide range of marine organisms has been recorded in both laboratory and field studies. Despite growing concern over microplastics, few studies have evaluated their concentrations and distribution in wild populations. Further, there is a need to identify cost-effective, standardized methodologies for microplastic extraction and analysis in organisms. In this thesis I present: (i) the results of a multi-scale field sampling to quantify and characterize microplastic occurrence and distribution in four benthic marine invertebrates from saltmarshes along the North Adriatic Italian coastal lagoons; (ii) a comparison of the effects and cost-effectiveness of two extraction protocols for microplastic isolation, tested on microfibers and on wild-collected organisms; (iii) the development of a novel field-based technique to quantify and characterize the microplastic uptake rates of wild and farmed populations of mussels (Mytilus galloprovincialis) through the analysis of their biodeposits. I found very low and patchy amounts of microplastics in the gastrointestinal tracts of the sampled organisms. The omnivorous crab Carcinus aestuarii was the species with the highest amounts of microplastics, but there was notable variation among individuals. There were no substantial differences between the enzymatic and alkaline extraction methods; however, the alkaline extraction was quicker and cheaper. Biodeposit traps proved to be an effective method to estimate mussel ingestion rates, but their performance differed significantly among sites, suggesting that the method, as currently designed, is sensitive to local environmental conditions. There were no differences in the ingestion rates of microplastics between farmed and wild mussels. The estimates of microplastic ingestion and the validated extraction procedures provide a strong basis for future work on microplastic pollution.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation aims to develop advanced analytical tools able to model surface waves propagating in elastic metasurfaces. In particular, four different objectives are defined and pursued throughout this work to enrich the description of the metasurface dynamics. First, a theoretical framework is developed to describe the dispersion properties of a seismic metasurface composed of discrete resonators placed on a porous medium, part of which is fully saturated. Such a model combines classical elasticity theory, Biot's poroelasticity and an effective medium approach to describe the metasurface dynamics and its coupling with the poroelastic substrate. Second, an exact formulation based on multiple scattering theory is developed to extend the two-dimensional classical Lamb's problem to the case of an elastic half-space coupled to an arbitrary number of discrete surface resonators. For this purpose, the incident wavefield generated by a harmonic source and the scattered field generated by each resonator are calculated. The substrate wavefield is then obtained as the solution of the coupled problem arising from the interference of the incident field and the multiple scattered fields of the oscillators. Third, the formulation discussed above is extended to three-dimensional contexts. The purpose here is to investigate the dynamic behavior and the topological properties of quasiperiodic elastic metasurfaces. Finally, the multiple scattering formulation is extended to model flexural metasurfaces, i.e., an array of thin plates. To this end, the resonant plates are modeled by means of their equivalent impedance, derived by exploiting the Kirchhoff plate theory. The proposed formulation permits the treatment of a general flexural metasurface, with no limitation on the number of plates or the configurations taken into account. Overall, the proposed analytical tools could pave the way for a better understanding of metasurface dynamics and their implementation in engineered devices.
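As a minimal sketch of the resonator coupling underlying such models, the snippet below evaluates the textbook force that a single mass–spring surface resonator exerts on its base under harmonic motion; the parameter values are arbitrary, and this is not the thesis's multiple-scattering or poroelastic formulation.

```python
import numpy as np

def resonator_force_amplitude(omega, mass, omega_r, surface_disp):
    """Force a mass-spring resonator (mass m, natural frequency omega_r)
    exerts on the substrate when its base moves harmonically with amplitude U:
    F = m * omega^2 * omega_r^2 / (omega_r^2 - omega^2) * U."""
    return mass * omega**2 * omega_r**2 / (omega_r**2 - omega**2) * surface_disp

# Illustrative sweep of driving frequencies around an 8 Hz resonator.
omega = 2 * np.pi * np.linspace(1.0, 20.0, 200)        # rad/s
F = resonator_force_amplitude(omega, mass=10.0, omega_r=2 * np.pi * 8.0,
                              surface_disp=1e-3)
# The force diverges near omega_r: this resonant loading of the surface is the
# basic mechanism behind metasurface band gaps for surface waves.
print(float(np.max(np.abs(F))))
```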

Relevance:

30.00%

Publisher:

Abstract:

The study of ancient, undeciphered scripts presents unique challenges that depend both on the nature of the problem and on the peculiarities of each writing system. In this thesis, I present two computational approaches that are tailored to two different tasks and writing systems. The first of these methods is aimed at the decipherment of the Linear A fraction signs, in order to discover their numerical values. This is achieved with a combination of constraint programming, ad-hoc metrics and paleographic considerations. The second main contribution of this thesis regards the creation of an unsupervised deep learning model which uses drawings of signs from an ancient writing system to learn to distinguish different graphemes in the vector space. This system, which is based on techniques used in the field of computer vision, is adapted to the study of ancient writing systems by incorporating information about sequences in the model, mirroring what is often done in natural language processing. In order to develop this model, the Cypriot Greek Syllabary is used as a target, since this is a deciphered writing system. Finally, this unsupervised model is adapted to the undeciphered Cypro-Minoan and is used to answer open questions about this script. In particular, by reconstructing multiple allographs that are not agreed upon by paleographers, it supports the idea that Cypro-Minoan is a single script and not a collection of three scripts, as has been proposed in the literature. These results on two different tasks show that computational methods can be applied to undeciphered scripts, despite the relatively small amount of available data, paving the way for further advancement in paleography using these methods.
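A minimal sketch of the constraint-based idea for the fraction signs is given below: candidate unit-fraction values are assigned to a handful of hypothetical signs and filtered by the requirement that attested sign combinations sum to at most one. The sign inventory, combinations and constraints are invented for illustration and do not reproduce the thesis's actual constraint model or the Linear A data.

```python
from fractions import Fraction
from itertools import product

# Hypothetical sign inventory and attested sign combinations (not real Linear A data).
signs = ["A", "B", "C"]
attested_groups = [["A", "B"], ["B", "C", "C"], ["A", "C"]]

# Candidate values: simple unit fractions, as commonly assumed for fraction signs.
candidates = [Fraction(1, d) for d in (2, 3, 4, 5, 6, 8, 10)]

solutions = []
for values in product(candidates, repeat=len(signs)):
    assignment = dict(zip(signs, values))
    # Constraints: distinct values per sign, and every attested group sums to <= 1.
    if len(set(values)) == len(values) and all(
        sum(assignment[s] for s in group) <= 1 for group in attested_groups
    ):
        solutions.append(assignment)

print(len(solutions), "candidate assignments satisfy the constraints")
```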

Relevance:

30.00%

Publisher:

Abstract:

The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics and electronics are all key areas which depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. This becomes even more complex when dealing with advanced functional materials. Their properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment, and subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields. Many techniques and instruments are continuously developed to enable new possibilities, both in the experimental and computational realms. Scientists strive to employ cutting-edge technologies in order to make progress. However, the field is strongly affected by unorganized file management and the proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above. Specifically, it describes developing features for specific classes of advanced materials and using them to train machine learning models and accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the process of using device simulations to train machine learning models; and dealing with scattered experimental data and using them to discover new patterns.
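The sketch below illustrates, on synthetic data, the general pattern of training a machine learning model on hand-crafted material descriptors to accelerate property predictions; the descriptors, target values and model choice are assumptions for illustration rather than the methods developed in this work.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in: rows are molecules described by hand-crafted features
# (e.g. composition fractions, structural fingerprints); the target mimics a
# property that would otherwise be expensive to compute ab initio.
rng = np.random.default_rng(0)
X = rng.random((500, 12))                                  # 500 molecules, 12 descriptors
y = X @ rng.random(12) + 0.1 * rng.standard_normal(500)    # toy property values

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out molecules:", round(model.score(X_test, y_test), 3))
```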

Relevance:

30.00%

Publisher:

Abstract:

Machine Learning makes computers capable of performing tasks that typically require human intelligence. A domain where it is having a considerable impact is the life sciences, allowing researchers to devise new biological analysis protocols, develop patient treatments more efficiently and quickly, and reduce healthcare costs. This Thesis presents new Machine Learning methods and pipelines for the life sciences, with a focus on unsupervised learning. At a methodological level, two methods are presented. The first is an "Ab Initio Local Principal Path", a revised and improved version of a pre-existing algorithm in the manifold learning realm. The second contribution is an improvement of the Import Vector Domain Description (one-class learning) through the Kullback-Leibler divergence. It hybridizes kernel methods with Deep Learning, obtaining a scalable solution, an improved probabilistic model, and state-of-the-art performance. Both methods are tested through several experiments, with a central focus on their relevance to the life sciences. Results show that they improve on the performance achieved by their previous versions. At the applicative level, two pipelines are presented. The first one is for the analysis of RNA-Seq datasets, both transcriptomic and single-cell data, and is aimed at identifying genes that may be involved in biological processes (e.g., the transition of tissues from normal to cancer). In this project, an R package is released on CRAN to make the pipeline accessible to the bioinformatics Community through high-level APIs. The second pipeline is in the drug discovery domain and is useful for identifying druggable pockets, namely regions of a protein with a high probability of binding a small molecule (a drug). Both these pipelines achieve remarkable results. Lastly, a further application is developed to identify the strengths and limitations of the "Principal Path" algorithm by analyzing vector spaces induced by Convolutional Neural Networks. This application is conducted in the music and visual arts domains.
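As a small illustration of the discrepancy measure mentioned above, the snippet below computes the Kullback-Leibler divergence between two discrete distributions; it is a generic example, not the thesis's hybrid kernel/Deep Learning formulation.

```python
import numpy as np
from scipy.stats import entropy

# Two discrete distributions over the same three outcomes.
p = np.array([0.6, 0.3, 0.1])
q = np.array([0.4, 0.4, 0.2])

kl_pq = entropy(p, q)                  # D_KL(p || q), in nats
kl_manual = np.sum(p * np.log(p / q))  # same quantity, written out explicitly
print(kl_pq, kl_manual)
```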

Relevance:

30.00%

Publisher:

Abstract:

In medicine, innovation depends on better knowledge of the mechanisms of the human body, a complex system of multi-scale constituents. Unraveling the complexity underlying diseases proves to be challenging. A deep understanding of the inner workings comes from dealing with many heterogeneous sources of information. Exploring the molecular status and the organization of genes, proteins and metabolites provides insights into what drives a disease, from aggressiveness to curability. Molecular constituents, however, are only the building blocks of the human body and cannot currently tell the whole story of diseases. This is why attention is now growing towards the simultaneous exploitation of multi-scale information. Holistic methods are therefore drawing interest as a way to address the problem of integrating heterogeneous data. The heterogeneity may derive from the diversity across data types and from the diversity within diseases. Here, four studies conducted data integration using custom-designed workflows that implement novel methods and views to tackle the heterogeneous characterization of diseases. The first study was devoted to determining shared gene regulatory signatures in onco-hematology, and it showed partial co-regulation across blood-related diseases. The second study focused on Acute Myeloid Leukemia and refined the unsupervised integration of genomic alterations, which turned out to better resemble clinical practice. In the third study, network integration for atherosclerosis demonstrated, as a proof of concept, the impact of network intelligibility when it comes to modelling heterogeneous data, which was shown to accelerate the identification of new potential pharmaceutical targets. Lastly, the fourth study introduced a new method to integrate multiple data types into a single latent heterogeneous representation, which facilitated the selection of important data types for predicting the tumour stage of invasive ductal carcinoma. The results of these four studies laid the groundwork to ease the detection of new biomarkers, ultimately beneficial to medical practice and to the ever-growing field of Personalized Medicine.
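A minimal baseline for integrating heterogeneous data types into a shared latent representation is sketched below, using block standardization followed by PCA on synthetic data; this is a generic illustration, not the integration method introduced in the fourth study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Toy stand-in for two data types measured on the same patients,
# e.g. gene expression and methylation (synthetic values here).
rng = np.random.default_rng(1)
expression = rng.random((100, 50))   # 100 patients x 50 genes
methylation = rng.random((100, 30))  # 100 patients x 30 probes

# Simple baseline integration: standardize each block, concatenate,
# then project to a shared low-dimensional latent space.
blocks = [StandardScaler().fit_transform(b) for b in (expression, methylation)]
joint = np.hstack(blocks)
latent = PCA(n_components=5).fit_transform(joint)
print(latent.shape)  # (100, 5): one latent representation per patient
```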

Relevance:

30.00%

Publisher:

Abstract:

This dissertation analyzes the exploitation of the orbital angular momentum (OAM) of electromagnetic waves with large intelligent surfaces in the near-field region and under line-of-sight conditions, in light of the holographic MIMO communication concept. Firstly, a characterization of the OAM-based communication problem is presented, and the relationship between OAM-carrying waves and communication modes is discussed. Then, practicable strategies for OAM detection using large intelligent surfaces, together with optimization methods based on beam focusing, are proposed. Numerical results characterize the effectiveness of OAM with respect to other strategies, including the proposed detection and optimization methods. It is shown that OAM waves constitute a particular choice of communication modes, i.e., an alternative basis set, which is sub-optimal with respect to the optimal basis functions that can be derived by solving eigenfunction problems. Moreover, even the joint use of OAM waves with focusing strategies leads to the conclusion that no channel capacity gains can be obtained with these transmission techniques.
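The claim that OAM waves form a sub-optimal basis can be illustrated with the standard communication-mode construction: discretize two apertures, build the line-of-sight channel matrix from the free-space Green's function, and take its singular value decomposition, whose singular vectors are the optimal modes. The geometry, wavelength and discretization below are assumptions for illustration, not the dissertation's setup.

```python
import numpy as np

# Toy setup: two parallel linear apertures discretized into point elements,
# with a free-space scalar Green's function as the LoS channel.
wavelength = 0.01                      # 10 mm (30 GHz), assumed
k = 2 * np.pi / wavelength
n_tx, n_rx, distance = 64, 64, 0.5     # elements per aperture, apertures 0.5 m apart
tx = np.linspace(-0.25, 0.25, n_tx)    # transmit aperture coordinates (m)
rx = np.linspace(-0.25, 0.25, n_rx)    # receive aperture coordinates (m)

r = np.sqrt(distance**2 + (rx[:, None] - tx[None, :])**2)
H = np.exp(-1j * k * r) / (4 * np.pi * r)   # channel matrix between elements

# The singular vectors of H are the optimal communication modes; the singular
# values rank how many modes effectively carry information over this link.
singular_values = np.linalg.svd(H, compute_uv=False)
print(singular_values[:8] / singular_values[0])   # normalized mode strengths
```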

Relevance:

30.00%

Publisher:

Abstract:

The following thesis investigates the issues involved in maintaining a Machine Learning model over time, concerning both the versioning of the model itself and of the data on which it is trained, and the tools for monitoring data and their distribution. The themes of Data Drift and Concept Drift are then explored, and the performance of some of the most popular Anomaly Detection techniques, such as VAE, PCA, and Monte Carlo Dropout, is evaluated.
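As a minimal sketch of one of the monitoring ideas mentioned above, the snippet below uses PCA reconstruction error on synthetic data to flag a drifted batch; the data, threshold and model size are illustrative assumptions rather than the thesis's evaluated pipelines.

```python
import numpy as np
from sklearn.decomposition import PCA

# Fit a PCA model on reference (training-time) data, then flag new batches
# whose reconstruction error rises above a threshold set on the reference.
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(1000, 20))   # data seen at training time
drifted = rng.normal(0.5, 1.5, size=(200, 20))      # later batch with drift

pca = PCA(n_components=5).fit(reference)

def reconstruction_error(X):
    X_hat = pca.inverse_transform(pca.transform(X))
    return np.mean((X - X_hat) ** 2, axis=1)

threshold = np.percentile(reconstruction_error(reference), 99)
alarms = reconstruction_error(drifted) > threshold
print(f"{alarms.mean():.0%} of the new batch flagged as drifted")
```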

Relevance:

20.00%

Publisher:

Abstract:

The aim of this clinical study was to determine the efficacy of Uncaria tomentosa (cat's claw) against denture stomatitis (DS). Fifty patients with DS were randomly assigned to 3 groups to receive 2% miconazole, placebo, or 2% U tomentosa gel. DS level was recorded immediately, after 1 week of treatment, and 1 week after treatment. The clinical effectiveness of each treatment was measured using Newton's criteria. Mycologic samples from the palatal mucosa and prosthesis were obtained to determine colony-forming units per milliliter (CFU/mL) and for fungal identification at each evaluation period. Candida species were identified with HiCrome Candida and the API 20C AUX biochemical test. DS severity decreased in all groups (P < .05). A significant reduction in the number of CFU/mL after 1 week (P < .05) was observed for all groups and remained after 14 days (P > .05). C albicans was the most prevalent microorganism before treatment, followed by C tropicalis, C glabrata, and C krusei, regardless of the group and time evaluated. U tomentosa gel had the same effect as 2% miconazole gel. U tomentosa gel is an effective topical adjuvant treatment for denture stomatitis.

Relevance:

20.00%

Publisher:

Abstract:

Very high field (29)Si-NMR measurements using a fully (29)Si-enriched URu(2)Si(2) single crystal were carried out in order to microscopically investigate the hidden order (HO) state and adjacent magnetic phases in the high field limit. At the lowest measured temperature of 0.4 K, a clear anomaly reflecting a Fermi surface instability near 22 T inside the HO state is detected by the (29)Si shift, (29)K(c). Moreover, a strong enhancement of (29)K(c) develops near a critical field H(c) ≃ 35.6 T, and the (29)Si-NMR signal disappears suddenly at H(c), indicating the total suppression of the HO state. Nevertheless, a weak and shifted (29)Si-NMR signal reappears for fields higher than H(c) at 4.2 K, providing evidence for a magnetic structure within the magnetic phase caused by the Ising-type anisotropy of the uranium ordered moments.

Relevance:

20.00%

Publisher:

Abstract:

Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled to Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a resolving power of ca. 500,000 and a mass accuracy of less than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, which are identified primarily as naphthenic acids, phenols and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
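A minimal sketch of the modelling step is shown below: a PLS regression is fitted to a synthetic matrix mimicking the sample-by-variable layout described above and evaluated by the root mean square error of prediction; the data are simulated, and no variable selection (VIP, iPLS or UVE) is implemented here.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in: rows are crude-oil samples, columns are intensities of
# assigned species from ESI(-)-FT-ICR MS; the target is the acid number (TAN).
rng = np.random.default_rng(0)
X = rng.random((60, 5700))                     # 60 samples x ~5700 variables
true_coef = np.zeros(5700)
true_coef[:200] = rng.random(200)              # only a subset of variables is informative
y = X @ true_coef + 0.5 * rng.standard_normal(60)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_train, y_train)
rmsep = np.sqrt(np.mean((pls.predict(X_test).ravel() - y_test) ** 2))
print("RMSEP:", round(rmsep, 3))
```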