936 results for indexing consistency
Abstract:
This is a project on the indexing of television content, that is, the process of tagging television programmes to facilitate searches according to different parameters. The world of television is immersed in a process of evolution and change driven by the arrival of digital television. This new way of understanding television will open up a wide range of possibilities and allow interaction between users and broadcasters. The first step in content management is the indexing of programmes according to their content. That is our objective: to index television content automatically by means of artificial intelligence.
Abstract:
Generalized multiresolution analyses are increasing sequences of subspaces of a Hilbert space H that fail to be multiresolution analyses in the sense of wavelet theory because the core subspace does not have an orthonormal basis generated by a fixed scaling function. Previous authors have studied a multiplicity function m which, loosely speaking, measures the failure of the GMRA to be an MRA. When the Hilbert space H is L2(Rn), the possible multiplicity functions have been characterized by Baggett and Merrill. Here we start with a function m satisfying a consistency condition which is known to be necessary, and build a GMRA in an abstract Hilbert space with multiplicity function m.
Abstract:
Report on the scientific sojourn at the University of Linköping from April to July 2007. Monitoring of the air intake system of an automotive engine is important to meet emission-related legislative diagnosis requirements. During the research, the problem of fault detection in the air intake system was stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints, and was solved using interval-based consistency techniques. These techniques are shown to be particularly efficient for checking the consistency of the Analytical Redundancy Relations (ARRs), dealing with uncertain measurements and parameters, and using experimental data. All experiments were performed on a four-cylinder turbo-charged spark-ignited SAAB engine located in the research laboratory of the Vehicular System Group at the University of Linköping.
Abstract:
Recent attempts to incorporate optimal fiscal policy into New Keynesian models subject to nominal inertia have tended to assume that policy makers are benevolent and have access to a commitment technology. A separate literature, on the New Political Economy, has focused on real economies in which policy instruments are used strategically in a world of political conflict. In this paper we combine these literatures and assume that policy is set in a New Keynesian economy by one of two policy makers facing electoral uncertainty (in terms of infrequent elections and an endogenous voting mechanism). The policy makers generally share the social welfare function, but differ in their preferences over fiscal expenditure (in its size and/or composition). Given this environment, policy is realistically constrained to be time-consistent. In a sticky-price economy, such heterogeneity gives rise to the possibility of one policy maker using (nominal) debt strategically to tie the hands of the other party and influence the outcome of future elections. This can give rise to a deficit bias, implying a sub-optimally high level of steady-state debt, and can also imply a sub-optimal response to shocks. The steady-state distortions and inflation bias this generates, combined with the volatility induced by the electoral cycle in a sticky-price environment, can significantly
Abstract:
If choices depend on the decision maker's mood, is the attempt to derive any consistency in choice doomed? In this paper we argue that, even with full unpredictability of mood, the way choices from a menu relate to choices from another menu exhibits some structure. We present two alternative models of 'moody choice' and show that, in either of them, not all choice patterns are possible. Indeed, we characterise both models in terms of consistency requirements of the observed choice data.
Abstract:
Study carried out during a stay at the Physics Department of New York University, United States, between 2006 and 2008. One of the highest-impact observations in modern cosmology has been the empirical determination that the Universe is currently in a phase of Accelerated Expansion (AE). This phenomenon implies either that the Universe is dominated by a new matter/energy sector, or that General Relativity ceases to hold on cosmological scales. The first possibility comprises the Dark Energy (DE) models, whose main problem is that DE must have properties so special that they are difficult to justify theoretically. The second possibility requires the construction of consistent theories of gravity modified at large distances (Large-Distance Modified Gravity, LDMG), which generalize models of massive gravity. Phenomenological interest in these theories also resurged with the appearance of the first examples of LDMG models, such as the Dvali-Gabadadze-Porrati (DGP) model, which consists of a type of brane in an extra dimension. Unfortunately, however, this model cannot consistently explain the AE of the Universe. One objective of this project has been to establish the internal and phenomenological viability of LDMG models. From the phenomenological point of view, we have focused on the most important question in practice: finding observational signatures that distinguish LDMG models from DE models. At a more theoretical level, we have also investigated the meaning of the instabilities of the DGP model. The other main objective we set ourselves was the construction of new LDMG theories. In the second part of this project, we developed and demonstrated the consistency of the "Cascading DGP" model, which generalizes the DGP model to additional extra dimensions and represents the second known consistent, Lorentz-invariant model in flat space.
The existence of LDMG models beyond DGP is of great interest, since it could make it possible to obtain the AE of the Universe in a purely geometric way.
Abstract:
This paper revisits the argument that the stabilisation bias that arises under discretionary monetary policy can be reduced if policy is delegated to a policymaker with redesigned objectives. We study four delegation schemes: price level targeting, interest rate smoothing, speed limits and straight conservatism. These can all increase social welfare in models with a unique discretionary equilibrium. We investigate how these schemes perform in a model with capital accumulation where uniqueness does not necessarily apply. We discuss how multiplicity arises and demonstrate that no delegation scheme is able to eliminate all potential bad equilibria. Price level targeting has two interesting features. It can create a new equilibrium that is welfare dominated, but it can also alter equilibrium stability properties and make coordination on the best equilibrium more likely.
Abstract:
This paper studies the behavior of a central bank that seeks to conduct policy optimally while having imperfect credibility and harboring doubts about its model. Taking the Smets-Wouters model as the central bank's approximating model, the paper's main findings are as follows. First, a central bank's credibility can have large consequences for how policy responds to shocks. Second, central banks that have low credibility can benefit from a desire for robustness, because this desire motivates the central bank to follow through on policy announcements that would otherwise not be time-consistent. Third, even relatively small departures from perfect credibility can produce important declines in policy performance. Finally, as a technical contribution, the paper develops a numerical procedure to solve the decision problem facing an imperfectly credible policymaker that seeks robustness.
Abstract:
This paper presents an axiomatic characterization of difference-form group contests, that is, contests fought among groups in which each group's probability of victory depends on the difference between their effective efforts. The axiomatization rests on the property of Equalizing Consistency, which states that the difference between winning probabilities in the grand contest and in a smaller contest should be identical across all participants in the smaller contest. This property overcomes some of the drawbacks of the widely used ratio-form contest success functions. Our characterization shows that criticisms commonly held against difference-form contest success functions, such as lack of scale invariance and zero elasticity of augmentation, are unfounded. By clarifying the properties of this family of contest success functions, the axiomatization can help researchers find the functional form best suited to their application of interest.
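To illustrate what "difference form" means here, the sketch below implements one simple, hypothetical member of this family (not the paper's axiomatized functional form): each group's winning probability is the equal share 1/n shifted by the gap between its effective effort and the average effective effort, so probabilities depend only on effort differences.

```python
# Hedged sketch of a difference-form contest success function.
# The impact function f is an arbitrary illustrative choice; with a
# bounded f the probabilities stay in [0, 1], and they always sum to 1
# by construction.

def difference_form_csf(efforts, f=lambda x: 0.1 * x):
    """Winning probabilities p_i = 1/n + f(x_i) - mean_j f(x_j)."""
    n = len(efforts)
    impacts = [f(x) for x in efforts]
    avg = sum(impacts) / n
    return [1.0 / n + fi - avg for fi in impacts]

probs = difference_form_csf([3.0, 1.0])
print(probs)  # ≈ [0.6, 0.4]
```

Note the scale-invariance point from the abstract: adding a constant to every group's effective effort leaves all winning probabilities unchanged, since only differences from the average enter.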
Abstract:
In the context of the two-stage threshold model of decision making, with the agent's choices determined by the interaction of three "structural variables," we study the restrictions on behavior that arise when one or more of these variables are exogenously known. Our results supply necessary and sufficient conditions for consistency with the model for all possible states of partial knowledge, and for both single- and multivalued choice functions.
Abstract:
Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models: value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than those obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even in environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
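Of the three methods, value function iteration is the most straightforward to sketch. The toy consumption-savings problem below (log utility, a fixed gross return, a small wealth grid) is illustrative only and is not the paper's model; it shows the core loop of iterating the Bellman operator to a fixed point.

```python
# Value function iteration on a toy problem: state = wealth on a grid,
# choice = next-period wealth, utility = log consumption.
# Saving s = w_next / R grows to w_next at the gross return R = 1.05.

import math

def value_function_iteration(grid, beta=0.95, R=1.05, tol=1e-8, max_iter=2000):
    n = len(grid)
    V = [0.0] * n  # initial guess for the value function
    for _ in range(max_iter):
        V_new = [0.0] * n
        for i, w in enumerate(grid):
            best = -float("inf")
            for j, w_next in enumerate(grid):
                c = w - w_next / R  # consumption implied by the savings choice
                if c <= 0:
                    continue  # infeasible choice
                best = max(best, math.log(c) + beta * V[j])
            V_new[i] = best
        # Stop when successive iterates agree in the sup norm.
        if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
            return V_new
        V = V_new
    return V
```

Because the Bellman operator is a contraction with modulus beta, the sup-norm distance between iterates shrinks geometrically, which is why a simple fixed-point loop with a tolerance suffices.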
Abstract:
OBJECTIVES: Advances in biopsychosocial science have underlined the importance of taking social history and a life course perspective into consideration in primary care. For both clinical and research purposes, this study aims to develop and validate a standardised instrument measuring both material and social deprivation at an individual level. METHODS: We identified relevant potential questions regarding deprivation using a systematic review, structured interviews, focus group interviews and a think-aloud approach. Item response theory analysis was then used to reduce the length of the 38-item questionnaire and derive the Deprivation in Primary Care Questionnaire (DiPCare-Q) index, using data obtained from a random sample of 200 patients during their planned visits to an ambulatory general internal medicine clinic. Patients completed the questionnaire a second time over the phone 3 days later to enable us to assess reliability. Content validity of the DiPCare-Q was then assessed by 17 general practitioners. Psychometric properties and validity of the final instrument were investigated in a second set of patients: the DiPCare-Q was administered to a random sample of 1898 patients attending one of 47 different private primary care practices in western Switzerland, along with questions on subjective social status, education, source of income, welfare status and subjective poverty. RESULTS: Deprivation was defined along three distinct dimensions: material (eight items), social (five items) and health deprivation (three items). Item consistency was high in both the derivation set (Kuder-Richardson Formula 20, KR-20 = 0.827) and the validation set (KR-20 = 0.778). The DiPCare-Q index was reliable (intraclass correlation coefficient = 0.847) and correlated with subjective social status (r(s) = -0.539). CONCLUSION: The DiPCare-Q is a rapid, reliable and validated instrument that may prove useful for measuring both material and social deprivation in primary care.
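The item-consistency statistic reported above is the Kuder-Richardson Formula 20 (KR-20), an internal-consistency coefficient for binary items. A minimal sketch of its computation, on an invented matrix of binary item responses, might look like:

```python
# KR-20 = k/(k-1) * (1 - sum(p_i * q_i) / var(total scores)),
# where p_i is the proportion of positive responses to item i and
# q_i = 1 - p_i. The response matrix below is illustrative data only
# (rows = respondents, columns = items).

def kr20(responses):
    n = len(responses)            # number of respondents
    k = len(responses[0])         # number of items
    # Proportion answering each item positively, and item variances p*q.
    p = [sum(row[i] for row in responses) / n for i in range(k)]
    sum_pq = sum(pi * (1 - pi) for pi in p)
    # Population variance of the total scores.
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - sum_pq / var_t)

print(kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]))  # → 0.75
```

Values near 1 indicate that the items behave consistently as a scale; the 0.827 and 0.778 reported in the abstract are in the range conventionally regarded as acceptable.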
Abstract:
This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has known asymptotic size and is consistent. The test is implemented using the bootstrap, with the distribution under the null hypothesis approximated by a dependent-sample bootstrap technique that captures the short-run dependence remaining after fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments, and the procedure is illustrated by applications to exchange rate volatility and dividend growth series.
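The log-periodogram estimator at the heart of such a test can be sketched as follows. This is the plain regression-based estimator of the long memory parameter d, without the paper's skip-sampling, aliasing correction, or bootstrap; the bandwidth choice m = sqrt(n) is a common rule of thumb, not the paper's.

```python
# Sketch of a log-periodogram (GPH-style) estimator of d: regress the
# log periodogram at the first m Fourier frequencies on
# -log(4 sin^2(lambda_j / 2)); the OLS slope estimates d.

import cmath
import math

def gph_estimate(x, m=None):
    n = len(x)
    if m is None:
        m = int(n ** 0.5)
    mean = sum(x) / n
    ys, regs = [], []
    for j in range(1, m + 1):
        lam = 2 * math.pi * j / n  # Fourier frequency lambda_j
        # Periodogram ordinate I(lambda_j) from the discrete Fourier transform.
        dft = sum((x[t] - mean) * cmath.exp(-1j * lam * t) for t in range(n))
        I_j = abs(dft) ** 2 / (2 * math.pi * n)
        ys.append(math.log(I_j))
        regs.append(-math.log(4 * math.sin(lam / 2) ** 2))
    # OLS slope of log I(lambda_j) on the regressor.
    rbar, ybar = sum(regs) / m, sum(ys) / m
    num = sum((r - rbar) * (y - ybar) for r, y in zip(regs, ys))
    den = sum((r - rbar) ** 2 for r in regs)
    return num / den
```

Under true long memory the slope recovers d; under spurious long memory (e.g. level shifts in a short-memory series) the estimate is biased away from zero, which is the contrast the composite test exploits.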
Abstract:
AIMS - To pilot the implementation of brief motivational intervention (BMI) among conscripts, and to test the effectiveness of BMI in young men voluntarily showing up for a single face-to-face alcohol BMI session. Participants were conscripts attending the army recruitment process in Lausanne. This process is mandatory for all Swiss males at age 19, and Lausanne serves all francophone Swiss men. METHODS - Of the 3'227 young men who were seen during the army recruitment procedures, 445 voluntarily showed up for a BMI and 367 were included in the study (exclusions were random and unsystematic, related to organizational aspects in the recruitment center). After an initial assessment, subjects were randomized into two groups: an immediate BMI and a 6-month delayed BMI (waiting-list design). A 6-month follow-up assessment was conducted in both groups. The BMI was a face-to-face 20-minute counseling session with a psychologist trained in motivational interviewing at baseline, plus a telephone session for the control group at follow-up. BMI strategies included the exploration and evocation of a possible behavior change, the importance of future change, readiness to change, and commitment to change. A filmed example of such an intervention is available in French at www.alcoologie.ch. RESULTS - All procedures are now fully implemented and working, and the provision of preventive efforts met with general approval from the army. 3'227 men were eligible for BMI and 445 of them (13.8%) showed up to receive one. 367 were included in the study, 181 in the BMI group and 186 in the control group. More than 86% of those included were reached at follow-up. With one exception, all findings on alcohol use went in the expected direction, i.e. a stronger decrease in alcohol use (or a smaller increase, as for usual weekly drinking amount) in the BMI group. The proportion of at-risk users for risky single occasion drinking (RSOD) decreased from 57% at baseline to 50.6%, a 6.4 percentage point decrease, in the BMI group, while there was only a 0.6 percentage point decrease (from 57.5% to 56.9%) in the control group. Moreover, the study showed a likelihood of crossover effects for other substances such as tobacco smoking and cannabis use. Despite these encouraging and consistent positive findings, none reached significance at conventional levels (p < 0.05). DISCUSSION - The data suggest a beneficial impact of BMI on alcohol use outcomes, and a potential effect on other substance use, in 19-year-old men attending army recruitment and showing up voluntarily for BMI. As the main aim was to implement and test the feasibility of conducting BMI in this setting, none of our findings reached statistical significance. The consistency of findings across measures and substances, however, raises hope that non-significance in the present study does not mean no effect, but mainly reflects the insufficient power of this pilot study. [Authors]
Abstract:
Satellite remote sensing imagery is used for forestry, conservation and environmental applications, but insufficient spatial resolution and, in particular, the unavailability of images at the precise time required for a given application often prevent it from reaching a fully operational stage. Airborne remote sensing has the advantage of custom-tuned sensors, resolution and timing, but its price prevents its use as a routine technique in these fields. Some Unmanned Aerial Vehicles (UAVs) may provide a "third way": a low-cost technique for acquiring remotely sensed information under close control of the end-user, albeit at the expense of lower-quality instrumentation and stability. This report evaluates a light remote sensing system based on a remotely controlled mini-UAV (ATMOS-3) equipped with a colour infrared camera (VEGCAM-1), designed and operated by CATUAV. We conducted a test mission over a Mediterranean landscape dominated by an evergreen woodland of Aleppo pine (Pinus halepensis) and holm oak (Quercus ilex) in the Montseny National Park (Catalonia, NE Spain). We took advantage of state-of-the-art ortho-rectified digital aerial imagery (acquired by the Institut Cartogràfic de Catalunya over the area during the previous year) and used it as a quality reference. In particular, we paid attention to: 1) the operational capability of flight and image acquisition according to a previously defined plan; 2) the radiometric and geometric quality of the images; and 3) the operational use of the images in the context of applications. We conclude that the system has reached an operational stage regarding flight activities, although with meteorological limits set by wind speed and turbulence. The availability of appropriate landing areas can also sometimes be limiting, but the system is able to land on small and relatively rough terrain such as patches of grassland or short matorral, and we have operated the UAV as far as 7 km from the control unit.
Radiometric quality is sufficient for interactive analysis, but probably insufficient for automated processing. A forthcoming camera is expected to greatly improve radiometric quality and consistency. Conventional GPS positioning with time synchronization provides a coarse orientation of the images, with no roll information.