8 results for Information technology in agriculture
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is investigated here bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing and information technologies, on the one hand giving an overview of the main tools adopted in the period – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – and on the other pointing out how these tools complemented each other and were instrumental in the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s; the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory; and the statisticians examined are Ronald Fisher and Frank Yates.
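The one-way analysis of variance at the centre of this history reduces to a short calculation of sums of squares, of the kind once carried out on the mechanical calculators the thesis describes. Below is a minimal Python sketch of the method; the crop-yield figures and the three-treatment setup are hypothetical illustrations, not data from the thesis.

```python
# Minimal one-way ANOVA (the method Fisher developed at Rothamsted).
# The yield figures below are hypothetical, for illustration only.
def one_way_anova(groups):
    k = len(groups)                           # number of treatments
    n = sum(len(g) for g in groups)           # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-treatment sum of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-treatment (residual) sum of squares
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Three hypothetical fertiliser treatments, three plots each
plots = [[4.2, 4.8, 4.5], [5.1, 5.6, 5.3], [3.9, 4.1, 4.0]]
f, df1, df2 = one_way_anova(plots)
print(f"F({df1},{df2}) = {f:.2f}")  # → F(2,6) = 25.00
```

A large F ratio, as here, indicates that variation between treatments dominates variation within them; Fisher's statistical tables then supplied the significance threshold.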
Abstract:
Chapter 1 studies how consumers’ switching costs affect the pricing and profits of firms competing in two-sided markets, such as Apple and Google in the smartphone market. When two-sided markets are dynamic rather than merely static, I show that switching costs lower the first-period price if network externalities are strong, in contrast to what has been found in one-sided markets. By contrast, switching costs soften price competition in the initial period if network externalities are weak and consumers are more patient than the platforms. Moreover, an increase in switching costs on one side decreases the first-period price on the other side. Chapter 2 examines firms’ incentives to invest in local and flexible resources when demand is uncertain and correlated. I find that the market power of the monopolist providing flexible resources distorts investment incentives, while competition mitigates this distortion. The extent of the improvement depends critically on demand correlation and the cost of capacity: under the social optimum and under monopoly, the relationship between investment and correlation is positive if the flexible resource is cheap and negative if it is costly; under duopoly, the relationship is positive. The analysis also sheds light on policy discussions in markets such as cloud computing. Chapter 3 develops a theory of sequential investments in cybersecurity. The regulator can use safety standards and liability rules to increase security. I show that the joint use of an optimal standard and a full liability rule leads to underinvestment ex ante and overinvestment ex post. Switching to a partial liability rule instead corrects these inefficiencies. This suggests that to improve security the regulator should encourage not only firms but also consumers to invest in security.
Abstract:
The thesis aims to make the dynamics of the tradeoffs involving privacy more visible, both theoretically and in two central current policy debates in European data protection law: the right to be forgotten and online tracking. In doing so, it offers an economic explanation for data protection law and provides a basis for the evaluation of further data protection measures.
Abstract:
This dissertation contributes to the scholarly debate on temporary teams by exploring team interactions and boundaries. The fundamental challenge in temporary teams originates from temporary participation. First, as participants join the team for a short period of time, there is not enough time to build trust, share understanding, and interact effectively. Consequently, team outputs and practices built on team interactions become vulnerable. Second, as participants move on and off the teams, the teams’ boundaries become blurred over time. This leads to uncertainty among team participants and leaders about who is or is not identified as a team member, causing collective disagreement within the team. Focusing on these challenges, we conducted this research in healthcare organisations, since the use of temporary teams in healthcare and hospital settings is prevalent. In particular, we focused on orthopaedic teams that provide personalised treatments for patients using 3D printing technology. Qualitative and quantitative data were collected using interviews, observations, questionnaires and archival data at the Rizzoli Orthopaedic Institute, Bologna, Italy. This study provides the following research outputs. The first is a conceptual study that explores the temporary-teams literature using bibliometric analysis and a systematic literature review to highlight research gaps. The second paper qualitatively studies temporary relationships within the teams, collecting data through group interviews and observations. The results highlight the role of short-term dyadic relationships as a ground for sharing and transferring knowledge at the team level. Moreover, the hierarchical structure of the teams facilitates knowledge sharing by supporting dyadic relationships within and beyond team meetings. The third paper investigates the impact of blurred boundaries on temporary teams’ performance. Using quantitative data collected through questionnaires and archival data, we conclude that boundary blurring in terms of fluidity, overlap and dispersion impacts team performance differently at high and low levels of task complexity.
Abstract:
The thesis deals with the problem of Model Selection (MS), motivated by information and prediction theory and focusing on parametric time series (TS) models. The main contribution of the thesis is the extension to the multivariate case of the Misspecification-Resistant Information Criterion (MRIC), a recently introduced criterion that solves the research problem Akaike originally posed 50 years ago, the problem that led to the definition of the AIC. The importance of MS is witnessed by the huge literature devoted to it in the scientific journals of many different disciplines. Despite such widespread treatment, the contributions that adopt a mathematically rigorous approach are not numerous, and one aim of this project is to review and assess them. Chapter 2 discusses methodological aspects of MS from the standpoint of information theory. Information criteria (IC) for the i.i.d. setting are surveyed along with their asymptotic properties, together with the cases of small samples, misspecification, and further estimators. Chapter 3 surveys criteria for TS: IC and prediction criteria are considered for univariate models (AR, ARMA) in the time and frequency domains; parametric multivariate models (VARMA, VAR); nonparametric nonlinear models (NAR); and high-dimensional models. The MRIC answers Akaike’s original question on efficient criteria for possibly-misspecified (PM) univariate TS models in multi-step prediction, including high-dimensional data and nonlinear models. Chapter 4 extends the MRIC to PM multivariate TS models for multi-step prediction, introducing the Vectorial MRIC (VMRIC). We show that the VMRIC is asymptotically efficient by proving the decomposition of the MSPE matrix and the consistency of its Method-of-Moments Estimator (MoME) for Least-Squares multi-step prediction with a univariate regressor. Chapter 5 extends the VMRIC to the general multiple-regressor case by showing that the MSPE matrix decomposition holds, obtaining consistency for its MoME, and proving its efficiency. The chapter concludes with a digression on the conditions for PM VARX models.
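To make the classical starting point of this literature concrete, the sketch below performs AIC-based order selection for an autoregressive model, the problem the abstract traces back to Akaike. It is a hand-rolled illustration on a synthetic seeded series; it does not implement the thesis's MRIC or VMRIC, and the model orders and data are hypothetical.

```python
import math
import random

def fit_ar_ols(x, p):
    """Fit AR(p) with intercept by conditional least squares; return residual variance."""
    n = len(x)
    rows = [[1.0] + [x[t - j] for j in range(1, p + 1)] for t in range(p, n)]
    y = x[p:]
    k = p + 1
    # Normal equations A b = c
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for i in range(k):
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    resid = [yt - sum(bi * ri for bi, ri in zip(b, r)) for r, yt in zip(rows, y)]
    return sum(e * e for e in resid) / len(resid)

def aic_order(x, max_p):
    """Gaussian AIC = n*log(sigma^2) + 2*(p+1) for each candidate order p."""
    n = len(x)
    return {p: n * math.log(fit_ar_ols(x, p)) + 2 * (p + 1) for p in range(1, max_p + 1)}

random.seed(0)
x = [0.0, 0.0]
for _ in range(300):  # simulate a stationary AR(2) process
    x.append(0.6 * x[-1] - 0.3 * x[-2] + random.gauss(0, 1))
scores = aic_order(x, 5)
best = min(scores, key=scores.get)
print("selected order:", best)
```

An efficient criterion, in the sense discussed in the abstract, is one whose selected model attains (asymptotically) the best achievable prediction error; the MRIC refines this trade-off for misspecified models and multi-step prediction.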
Abstract:
In recent decades, the use of organic fertilizers has gained increasing interest, mainly for two reasons: their ability to improve soil fertility and the need to find a sustainable alternative to mineral and synthetic fertilizers. In this context, sewage sludge is a useful organic matrix that can be successfully used in agriculture, owing to a chemical composition rich in organic matter, nitrogen, phosphorus and other micronutrients necessary for plant growth. This work investigated three indispensable aspects of sewage sludge application as an organic fertilizer (i.e., physico-chemical properties, agronomic efficiency and environmental safety), emphasizing the role of tannery sludge. In a comparison study with municipal sewage sludge, results showed that the targeted analyses applied (total carbon and nitrogen content, carbon and nitrogen isotope ratios, infrared spectroscopy and thermal analysis) were able to discriminate tannery sludge from municipal sludge, highlighting differences in composition due to the origin of the wastewater and the treatment processes used in the plants. Regarding agronomic efficiency, N bioavailability was tested in a selection of organic fertilizers, including tannery sludge and tannery-sludge-based fertilizers. Specifically, hot-water-extractable N proved to be a good chemical indicator, providing a rapid and reliable indication of N bioavailability in soil. Finally, the behavior of oxybenzone (an emerging organic contaminant detected in sewage sludge) was studied in soils with different physico-chemical properties. Adsorption and desorption experiments showed that the mobility of oxybenzone is reduced in soils rich in organic matter. Furthermore, spectroscopic methods (e.g., infrared spectroscopy and surface-enhanced Raman spectroscopy) were used to study the mechanisms of oxybenzone–humic acid interaction, finding that H-bonding and π-π stacking predominate.
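Batch adsorption data of the kind described here are commonly summarized with a Freundlich isotherm, q = Kf·Cⁿ, fitted by least squares in log-log form. The sketch below illustrates that standard technique only; it is not taken from the thesis, and the concentration data are synthetic.

```python
import math

# Hypothetical sketch: fitting a Freundlich isotherm q = Kf * C**n to
# batch adsorption data via linear regression on (log C, log q).
# The data points below are synthetic, generated from an exact power law.
def fit_freundlich(conc, sorbed):
    xs = [math.log(c) for c in conc]
    ys = [math.log(q) for q in sorbed]
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), slope   # (Kf, n)

conc = [0.5, 1.0, 2.0, 4.0, 8.0]          # equilibrium concentrations (mg/L)
sorbed = [2.0 * c ** 0.8 for c in conc]   # amounts sorbed (mg/kg), synthetic
kf, n = fit_freundlich(conc, sorbed)
print(f"Kf = {kf:.3f}, n = {n:.3f}")
```

A higher fitted Kf indicates stronger sorption, which is consistent with the finding that organic-matter-rich soils retain oxybenzone and reduce its mobility.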
Abstract:
This thesis investigates how individuals can develop, exercise, and maintain autonomy and freedom in the presence of information technology. It is particularly interested in how information technology can impose autonomy constraints. The first part identifies a problem with current autonomy discourse: there is no agreed-upon object of reference when bemoaning the loss of, or risk to, an individual’s autonomy. Here, the thesis introduces a pragmatic conceptual framework to classify autonomy constraints. In essence, the proposed framework divides autonomy into three categories: intrinsic autonomy, relational autonomy and informational autonomy. The second part of the thesis investigates the role of information technology in enabling and facilitating autonomy constraints. The analysis identifies eleven characteristics of information technology, as it is embedded in society, so-called vectors of influence, that pose a substantial risk to an individual’s autonomy. These vectors are assigned to three sets corresponding to the general sphere of the information transfer process to which they can be attributed, namely domain-specific vectors, agent-specific vectors and information-recipient-specific vectors. The third part of the thesis investigates selected ethical and legal implications of autonomy constraints imposed by information technology. It shows the utility of the theoretical frameworks introduced earlier in the thesis when conducting an ethical analysis of autonomy-constraining technology. It also traces the concept of autonomy in European data laws and investigates the impact of individuals’ cultural embeddings on efforts to safeguard autonomy, showing intercultural flashpoints of autonomy differences. In view of this, the thesis approaches the exercise and constraint of autonomy in the presence of information technology systems holistically. It contributes to establishing a common understanding of (intuitive) terminology and concepts, connects this to current phenomena arising out of ever-increasing interconnectivity and computational power, and helps operationalize the protection of autonomy through application of the proposed frameworks.