900 results for Canadian Experiment
Abstract:
This work aims to shed some light on longshore sediment transport (LST) on the highly energetic northwest coast of Portugal. Data obtained through a sand-tracer experiment are compared with estimates from the original and the newly re-evaluated longshore sediment transport formulas (the USACE Waterways Experiment Station's Coastal Engineering Research Center (CERC), Kamphuis, and Bayram bulk formulas) to assess their performance. The field experiment with dyed sand was held at Ofir Beach during one tidal cycle under medium wave-energy conditions. Local hydrodynamic conditions and beach topography were recorded. The tracer was driven southward in response to the local swell and wind- and wave-induced currents (Hsb = 0.75 m, Tp = 11.5 s, θb = 8–12°). The LST was estimated using a linear sediment transport flux approach. The obtained value (2.3×10⁻³ m³·s⁻¹) approached the estimate provided by the original Bayram formula (2.5×10⁻³ m³·s⁻¹). The other formulas overestimated the transport, although the newly re-evaluated formulas also yield results close to the measured value. Therefore, the results of this work indicate that the Bayram formula may give satisfactory results for predicting longshore sediment transport on Ofir Beach.
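For context, a minimal sketch of the CERC-type bulk formula listed above, in its commonly published form rather than as quoted from this work; the empirical coefficient K and the sediment porosity n are site-dependent assumptions:

\[
I_\ell = K\,(E c_g)_b \sin\theta_b \cos\theta_b, \qquad
Q_\ell = \frac{I_\ell}{(\rho_s - \rho)\,g\,(1 - n)}, \qquad
E_b = \tfrac{1}{8}\rho g H_{sb}^2,
\]

where I_ℓ is the immersed-weight transport rate, Q_ℓ the volumetric rate (in m³·s⁻¹, the unit reported above), (E c_g)_b the wave energy flux evaluated at breaking, and θ_b the breaker angle.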
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
The limitations of access to finance in Africa, together with the recent boom in cell phone use in that continent, created high expectations regarding the introduction of mobile money in many African countries. The success story of M-PESA in Kenya raised the bar further. We designed and conducted a field experiment to assess the impact of randomized mobile money dissemination in rural Mozambique. For this purpose we benefit from the fact that mobile money was only recently launched in the country, allowing for the identification of a pure control group. This paper reports on the first results of this ongoing project after the first wave of dissemination efforts in rural locations, which included the recruitment and training of mobile money agents, community meetings and theaters, as well as individual rural campaigning. Administrative and behavioral data both show clear adherence to the services in the treatment group. Financial literacy and trust outcomes are also positively affected by the treatment. We present behavioral evidence that the marginal willingness to remit was increased by the availability of mobile money. Finally, we observe a tendency for mobile money to substitute traditional alternatives for both savings and remittances.
Abstract:
We investigate the determinants of giving in a lab-in-the-field experiment with large stakes. Study participants in urban Mozambique play dictator games where their counterpart is the closest person to them outside their household. Dictators share more with counterparts when they have the option of giving in kind (in the form of goods), compared to giving that must be in cash. Qualitative post-experiment responses suggest that this effect is driven by a desire to control how recipients use gifted resources. Standard economic determinants such as the rate of return to giving and the size of the endowment also affect giving, but the effects of even large changes in these determinants are significantly smaller than the effect of the in-kind option. Our results support theories of giving where the utility of givers depends on the composition (not just the level) of gift-recipient expenditures, and givers thus seek control over transferred resources.
Abstract:
Vitamin A deficiency is a widespread public health problem in Sub-Saharan Africa. This paper analyzes the impact of a food-based intervention to fight vitamin A deficiency using orange-fleshed sweet potato (OFSP). We conducted a randomized evaluation of OFSP-related training for female farmers in Mozambique, in which the treatment group was taught basic concepts of nutrition as well as OFSP planting and cooking skills. We found encouraging evidence of changes in behavior and attitudes towards OFSP consumption and planting, and considerable increases in nutrition-related knowledge, as well as in knowledge of cooking and planting OFSP.
Abstract:
Do information flows matter for remittance behavior? We design and implement a randomized control trial to quantitatively assess the role of communication between migrants and their contacts abroad on the extent and value of remittance flows. In the experiment, a random sample of 1,500 migrants residing in Ireland was offered the possibility of contacting their networks outside the host country for free over a varying number of months. We find a sizable, positive impact of our intervention on the value of migrant remittances sent. Our results rule out the possibility that the remittance effect we identify is a simple substitution effect. Instead, our analysis points to this effect being a likely result of improved information via factors such as better migrant control over remittance use, enhanced trust in remittance channels due to experience sharing, or increased remittance recipients’ social pressure on migrants.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Economics from the NOVA – School of Business and Economics
Abstract:
In this research we conducted a mixed-methods study, using qualitative and quantitative analysis to examine the relationship between mobile advertising and mobile app user acquisition, its impact, and the conclusions companies can derive from it. Data were gathered from the management of mobile advertising campaigns for a portfolio of three different mobile apps. We found that a number of implications can be extracted from this intersection, namely for product development, internationalisation, and the management of the marketing budget. We propose further research on alternative app user sources, the impact of revenue on apps, and the exploitation of two product segments: wearable technology and the Internet of Things.
Abstract:
We are living in the era of Big Data, a time characterized by the continuous creation of vast amounts of data originating from different sources and in different formats. First with the rise of social networks and, more recently, with the advent of the Internet of Things (IoT), in which everyone and (eventually) everything is linked to the Internet, data with enormous potential for organizations is being continuously generated. In order to be more competitive, organizations want to access and explore all the richness present in those data. Indeed, Big Data is only as valuable as the insights organizations gather from it to make better decisions, which is the main goal of Business Intelligence. In this paper we describe an experiment in which data obtained from a NoSQL data source (a database technology explicitly developed to deal with the specificities of Big Data) is used to feed a Business Intelligence solution.
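As an illustration of the kind of pipeline this abstract describes, the sketch below pulls documents from a NoSQL source and loads them into a relational table that a Business Intelligence tool could query. It is a minimal sketch under stated assumptions: MongoDB stands in for the NoSQL data source, SQLite for the BI solution's relational store, and the database, collection, and field names (sensordata, readings, sensor_id, timestamp, value) are hypothetical, not taken from the paper.

# Minimal ETL sketch: NoSQL (MongoDB) -> relational table for a BI front end.
# Hypothetical names throughout; requires the pymongo package and a local MongoDB.
import sqlite3
from pymongo import MongoClient

# Extract: read documents from the assumed MongoDB collection.
client = MongoClient("mongodb://localhost:27017")
readings = client["sensordata"]["readings"]
docs = readings.find({}, {"_id": 0, "sensor_id": 1, "timestamp": 1, "value": 1})

# Transform: flatten the JSON-like documents into plain rows.
rows = [(d.get("sensor_id"), str(d.get("timestamp")), d.get("value")) for d in docs]

# Load: store the rows in a relational table that BI dashboards can consume.
dw = sqlite3.connect("bi_dw.db")
dw.execute(
    "CREATE TABLE IF NOT EXISTS fact_readings ("
    "sensor_id TEXT, timestamp TEXT, value REAL)"
)
dw.executemany("INSERT INTO fact_readings VALUES (?, ?, ?)", rows)
dw.commit()
dw.close()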
Abstract:
A search for a charged Higgs boson, H±, decaying to a W± boson and a Z boson is presented. The search is based on 20.3 fb−1 of proton-proton collision data at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the LHC. The H± boson is assumed to be produced via vector-boson fusion and the decays W± → qq̄′ and Z → e+e−/μ+μ− are considered. The search is performed in a range of charged Higgs boson masses from 200 to 1000 GeV. No evidence for the production of an H± boson is observed. Upper limits of 31–1020 fb at 95% CL are placed on the cross section for vector-boson fusion production of an H± boson times its branching fraction to W±Z. The limits are compared with predictions from the Georgi-Machacek Higgs Triplet Model.
Abstract:
A search for the decay to a pair of new particles of either the 125 GeV Higgs boson (h) or a second CP-even Higgs boson (H) is presented. The dataset corresponds to an integrated luminosity of 20.3 fb−1 of pp collisions at √s = 8 TeV recorded by the ATLAS experiment at the LHC in 2012. The search was done in the context of the next-to-minimal supersymmetric standard model, in which the new particles are the lightest neutral pseudoscalar Higgs bosons (a). One of the two a bosons is required to decay to two muons while the other is required to decay to two τ-leptons. No significant excess is observed above the expected backgrounds in the dimuon invariant mass range from 3.7 GeV to 50 GeV. Upper limits are placed on the production of h→aa relative to the Standard Model gg→h production, assuming no coupling of the a boson to quarks. The most stringent limit is placed at 3.5% for ma = 3.75 GeV. Upper limits are also placed on the production cross section of H→aa from 2.33 pb to 0.72 pb, for fixed ma = 5 GeV with mH ranging from 100 GeV to 500 GeV.
Abstract:
This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy of √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb−1. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.
Abstract:
Biofilm research is growing more diverse and dependent on high-throughput technologies, and the large-scale production of results aggravates data substantiation. In particular, it is often the case that experimental protocols are adapted to meet the needs of a particular laboratory and no statistical validation of the modified method is provided. This paper discusses the impact of intra-laboratory adaptation and non-rigorous documentation of experimental protocols on biofilm data interchange and validation. The case study is a non-standard, but widely used, workflow for Pseudomonas aeruginosa biofilm development, considering three analysis assays: the crystal violet (CV) assay for biomass quantification, the XTT assay for respiratory activity assessment, and the colony forming units (CFU) assay for determination of cell viability. The ruggedness of the protocol was assessed by introducing small changes in the biofilm growth conditions, which simulate minor protocol adaptations and non-rigorous protocol documentation. Results show that even minor variations in the biofilm growth conditions may affect the results considerably, and that the biofilm analysis assays lack repeatability. Intra-laboratory validation of non-standard protocols is found critical to ensure data quality and enable the comparison of results within and among laboratories.
Abstract:
The normalized differential cross section for top-quark pair production in association with at least one jet is studied as a function of the inverse of the invariant mass of the tt̄+1-jet system. This distribution can be used for a precise determination of the top-quark mass since gluon radiation depends on the mass of the quarks. The experimental analysis is based on proton-proton collision data collected by the ATLAS detector at the LHC with a centre-of-mass energy of 7 TeV corresponding to an integrated luminosity of 4.6 fb−1. The selected events were identified using the lepton+jets top-quark-pair decay channel, where lepton refers to either an electron or a muon. The observed distribution is compared to a theoretical prediction at next-to-leading-order accuracy in quantum chromodynamics using the pole-mass scheme. With this method, the measured value of the top-quark pole mass is m_t^pole = 173.7 ± 1.5 (stat.) ± 1.4 (syst.) +1.0/−0.5 (theory) GeV. This result represents the most precise measurement of the top-quark pole mass to date.
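For reference, a sketch of the observable commonly used in this tt̄+1-jet method (as proposed by Alioli et al.); the normalization scale m₀, commonly taken as 170 GeV, is not stated in this abstract and is an assumption here:

\[
\mathcal{R}(m_t^{\text{pole}},\rho_s) =
\frac{1}{\sigma_{t\bar t + 1\text{-jet}}}
\frac{d\sigma_{t\bar t + 1\text{-jet}}}{d\rho_s},
\qquad
\rho_s = \frac{2 m_0}{\sqrt{s_{t\bar t + 1\text{-jet}}}},
\]

where √(s_{tt̄+1-jet}) is the invariant mass of the tt̄+1-jet system, so ρ_s is proportional to the inverse of that mass; the measured R distribution is compared with the NLO QCD prediction as a function of m_t^pole.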