902 results for "Real state credit"
Abstract:
Body area networks (BANs) are emerging as an enabling technology for many human-centered application domains such as health care, sport, fitness, wellness, ergonomics, emergency, safety, security, and sociality. A BAN, which basically consists of wireless wearable sensor nodes usually coordinated by a static or mobile device, is mainly used to monitor a single assisted person. Data generated by a BAN can be processed in real time by the BAN coordinator and/or transmitted to a server side for online/offline processing and long-term storage. A network of BANs worn by a community of people produces large amounts of contextual data that require a scalable and efficient approach to processing and storage. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of body sensor data streams. In this paper, we motivate the introduction of Cloud-assisted BANs and lay out the main challenges that need to be addressed for their development and management. The current state of the art is reviewed and framed according to the main requirements for effective Cloud-assisted BAN architectures. Finally, relevant open research issues in terms of efficiency, scalability, security, interoperability, prototyping, and dynamic deployment and management are discussed.
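A minimal sketch of the data path such an architecture implies, assuming simulated sensor readings and a placeholder cloud endpoint; the real systems surveyed in the paper differ in protocols and platforms.

```python
import json
import random
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.org/ban/ingest"  # placeholder URL, not a real service

def read_sensors():
    # Stand-in for wearable nodes reporting to the BAN coordinator.
    return {
        "heart_rate_bpm": random.gauss(70, 5),
        "skin_temp_c": random.gauss(33.5, 0.3),
        "timestamp": time.time(),
    }

def coordinator_loop(batch_size=10):
    """Aggregate readings locally, then ship each batch to the server side."""
    batch = []
    for _ in range(batch_size):
        batch.append(read_sensors())
        time.sleep(0.1)  # sampling period; real BANs negotiate this per sensor
    payload = json.dumps(batch).encode()
    req = urllib.request.Request(CLOUD_ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    # urllib.request.urlopen(req)  # disabled: the endpoint above is a placeholder
    return payload

if __name__ == "__main__":
    print(len(coordinator_loop()), "bytes queued for upload")
```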
Abstract:
This paper investigates how political theorists and philosophers should understand egalitarian political demands in light of the increasingly important realist critique of much of contemporary political theory and philosophy. It suggests, first, that what Martin O'Neill has called non-intrinsic egalitarianism is, in one form at least, a potentially realistic egalitarian political project and, second, that realists may be compelled to impose an egalitarian threshold on state claims to legitimacy under certain circumstances. Non-intrinsic egalitarianism can meet realism's methodological requirements because it does not have to assume an unavailable moral consensus: it can focus on widely acknowledged bads rather than contentious claims about the good. Further, an appropriately formulated non-intrinsic egalitarianism may be a minimum requirement of an appropriately realistic claim by a political order to authoritatively structure some of its members' lives. Without at least a threshold set of egalitarian commitments, a political order seems unable to be transparent to many of its worse-off members under a plausible construal of contemporary conditions.
Abstract:
Trading commercial real estate involves a process of exchange that is costly and that occurs over an extended and uncertain period of time. This has consequences for the performance and risk of real estate investments. Most research on transaction times has concerned residential rather than commercial real estate. We study the time taken to transact commercial real estate assets in the UK using a sample of 578 transactions over the period 2004 to 2013. We measure average time to transact from both buyer and seller perspectives, distinguishing the search and due diligence phases of the process, and we conduct econometric analysis to explain variation in due diligence times between assets. The median time for purchase of real estate from introduction to completion was 104 days, and the median time for sale from marketing to completion was 135 days. There is considerable variation around these times, and the results suggest that some of this variation is related to the state of the market, the type and quality of the asset, and the type of participants involved in the transaction. Our findings shed light on the drivers of liquidity at the individual asset level and can inform models that quantify the impact of uncertain time on market on real estate investment risk.
Abstract:
Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate, and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice to use a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of the conclusions to alternative choices.
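A small worked example of the point about reference periods, using a synthetic warming series; the trend, noise level, and crossing years are illustrative only, not model output.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1850, 2101)
# Synthetic global-mean temperature: a slow linear trend plus interannual noise.
temps = 0.01 * (years - 1850) + rng.normal(0.0, 0.1, years.size)

def anomaly(series, years, ref_start, ref_end):
    """Anomalies relative to the mean over [ref_start, ref_end]."""
    ref = series[(years >= ref_start) & (years <= ref_end)].mean()
    return series - ref

for ref in [(1850, 1900), (1986, 2005)]:
    anom = anomaly(temps, years, *ref)
    crossing = years[anom >= 2.0]
    first = crossing[0] if crossing.size else None
    print(f"reference {ref}: 2 K first exceeded in {first}")
```

Running this shows that the year in which the same series first exceeds a 2 K anomaly shifts substantially with the baseline, which is exactly the sensitivity the abstract describes.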
Abstract:
In this paper, we construct a dynamic portrait of the inner asteroid belt. We use information about the distribution of test particles, initially placed on a perfectly rectangular grid of initial conditions, after 4.2 Myr of gravitational interactions with the Sun and five planets, from Mars to Neptune. Using the spectral analysis method introduced by Michtchenko et al., the asteroidal behaviour is illustrated in detail on dynamical, averaged and frequency maps. On the averaged and frequency maps, we superpose information on the proper elements and proper frequencies of real objects, extracted from the AstDyS database constructed by Milani and Knežević. A comparison of the maps with the distribution of real objects allows us to detect possible dynamical mechanisms acting in the domain under study; these mechanisms are related to mean-motion and secular resonances. We note that the two- and three-body mean-motion resonances and the secular resonances (strong linear and weaker non-linear) play an important role in the diffusive transport of the objects. Their long-lasting action, combined with the Yarkovsky effect, may explain many observed features of the density, size and taxonomic distributions of the asteroids.
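A sketch of the kind of frequency analysis involved, recovering the dominant frequency of a synthetic time series of an orbital element with a plain FFT; the signal here is invented, and the actual method of Michtchenko et al. is considerably more elaborate.

```python
import numpy as np

dt = 100.0                       # sampling step in years
t = np.arange(0, 4.2e6, dt)      # 4.2 Myr span, matching the integration above
# Synthetic eccentricity signal: a 50,000-yr secular term plus noise.
signal = 0.1 + 0.02 * np.cos(2 * np.pi * t / 5e4) + np.random.normal(0, 1e-3, t.size)

spec = np.abs(np.fft.rfft(signal - signal.mean()))   # remove DC before transforming
freqs = np.fft.rfftfreq(t.size, d=dt)                # cycles per year
dominant = freqs[np.argmax(spec)]
print(f"dominant period: {1.0 / dominant:.0f} yr")   # ~50,000 yr
```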
Abstract:
Current knowledge of pathogenic hantaviruses indicates that wild rodents are their primary natural reservoir. Specific primers for detecting the presence of viral genomes were developed and used in a SYBR-Green-based real-time RT-PCR protocol. One hundred sixty-four rodents native to the Atlantic Forest biome were captured in São Paulo State, Brazil, and their tissues were tested. Hantavirus RNA was detected in sixteen rodents: three specimens of Akodon montensis, three of Akodon cursor, two of Necromys lasiurus, one of Juliomys sp., one of Thaptomys nigrita, five of Oligoryzomys nigripes, and one of Oryzomys sp. This SYBR Green real-time RT-PCR method for the detection of hantavirus may be useful for surveying hantaviruses in Brazil.
Abstract:
Study Objectives: Chronic sleep deprivation of rats causes hyperphagia without body weight gain. Sleep deprivation hyperphagia is prompted by changes in pathways governing food intake; hyperphagia may be adaptive to sleep deprivation hypermetabolism. A recent paper suggested that sleep deprivation might inhibit the ability of rats to increase food intake and that hyperphagia may be an artifact of uncorrected chow spillage. To resolve this, a palatable liquid diet (Ensure) was used, for which spillage is insignificant. Design: Sleep deprivation of male Sprague-Dawley rats was enforced for 10 days by the flowerpot/platform paradigm. Daily food intake and body weight were measured. On day 10, rats were transcardially perfused for analysis of hypothalamic mRNA expression of the orexigen neuropeptide Y (NPY). Setting: Morgan State University, sleep deprivation and transcardial perfusion; University of Maryland, NPY in situ hybridization and analysis. Measurements and Results: Using a liquid diet for accurate daily measurements, there was no change in food intake during the first 5 days of sleep deprivation. Importantly, from days 6-10 intake increased significantly, peaking at 29% above baseline. Control rats steadily gained weight, but sleep-deprived rats did not. Hypothalamic NPY mRNA levels were positively correlated with stimulation of food intake and negatively correlated with changes in body weight. Conclusion: Sleep deprivation hyperphagia may not be apparent over the short term (i.e., <= 5 days), but when sleep deprivation extends beyond 6 days, it is readily observed. The timing of changes in body weight and food intake suggests that the negative energy balance induced by sleep deprivation prompts the neural changes that evoke hyperphagia.
Abstract:
Attention is a critical mechanism for visual scene analysis. By means of attention, it is possible to break down the analysis of a complex scene into the analysis of its parts through a selection process. Empirical studies demonstrate that attentional selection is conducted on visual objects as a whole. We present a neurocomputational model of object-based selection in the framework of oscillatory correlation. By segmenting an input scene and integrating the segments with their conspicuity obtained from a saliency map, the model selects salient objects rather than salient locations. The proposed system is composed of three modules: a saliency map providing saliency values of image locations, image segmentation for breaking the input scene into a set of objects, and object selection, which allows one object of the scene to be selected at a time. The system has been applied to real gray-level and color images, and the simulation results show its effectiveness.
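A minimal sketch of the object-selection step, assuming the saliency map and the segmentation (a label image) are already computed; the oscillatory-correlation dynamics of the actual model are not reproduced here.

```python
import numpy as np

def select_object(saliency, labels):
    """Pick the segment with the highest mean saliency (object-based selection)."""
    best_label, best_score = None, -np.inf
    for lab in np.unique(labels):
        if lab == 0:           # treat label 0 as background
            continue
        score = saliency[labels == lab].mean()
        if score > best_score:
            best_label, best_score = lab, score
    return best_label, best_score

# Toy 4x4 scene: two objects, where object 2 lies on more salient pixels.
saliency = np.array([[0.1, 0.1, 0.8, 0.9],
                     [0.1, 0.2, 0.9, 0.7],
                     [0.3, 0.1, 0.1, 0.1],
                     [0.2, 0.1, 0.1, 0.1]])
labels = np.array([[1, 1, 2, 2],
                   [1, 1, 2, 2],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(select_object(saliency, labels))   # object 2 wins with mean saliency 0.825
```

Scoring whole segments rather than single pixels is what makes the selection object-based instead of location-based.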
Abstract:
Credit scoring modelling comprises one of the leading formal tools for supporting the granting of credit. Its core objective is the generation of a score by means of which potential clients can be ranked by their probability of default. A critical factor is whether a credit scoring model is accurate enough to classify clients correctly as good or bad payers. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining predicted values from models fitted to several replicated datasets and then to combine them into a single predictive classification in order to improve classification accuracy. In this paper we propose a new bagging-type variant, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is motivated by credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit-granting dataset, with up to three successions of resamplings. We observed better classification accuracy for the two-bagged and three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach may improve modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement.
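The abstract does not spell out the exact resampling scheme of poly-bagging, so the sketch below shows only the plain bagging baseline it builds on, applied to synthetic credit-like data with scikit-learn; the class imbalance and sample sizes are made up.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a credit dataset: features -> good/bad payer,
# with bad payers as the 20% minority class.
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.8, 0.2],
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# Plain bagging: many classifiers (decision trees by default) are fit on
# bootstrap replicates and their predictions combined by majority vote.
model = BaggingClassifier(n_estimators=100, random_state=42)
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.3f}")
```

Poly-bagging, as described above, repeats the resampling-and-combining step over successive rounds; the paper itself defines that succession precisely.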
Abstract:
Drinking water utilities in urban areas are focused on finding smart solutions to new challenges in their real-time operation: limited water resources, intensive energy requirements, a growing population, costly and ageing infrastructure, increasingly stringent regulations, and increased attention to the environmental impact of water use. Such challenges force water managers to monitor and control not only water supply and distribution, but also consumer demand. This paper presents and discusses novel methodologies and procedures towards an integrated water resource management system based on advanced information and communication technologies (ICT) for automation and telecommunications, aimed at largely improving the efficiency of drinking water networks (DWN) in terms of water use, energy consumption, water loss minimization, and water quality guarantees. In particular, the paper addresses the first results of the European project EFFINET (FP7-ICT2011-8-318556), devoted to the monitoring and control of the DWN in Barcelona (Spain). Results are split into two levels according to different management objectives: (i) the monitoring level is concerned with all aspects of observing the current state of the system and detecting/diagnosing abnormal situations, achieved through sensors and communications technology together with mathematical models; (ii) the control level is concerned with computing the best suitable and admissible control strategies for network actuators so as to optimize a given set of operational goals related to the performance of the overall system; this level covers network control (optimal management of water and energy) and demand management (smart metering, efficient supply). The consideration of the Barcelona DWN as the case study will allow us to prove the general applicability of the proposed integrated ICT solutions and their effectiveness in the management of DWNs, with considerable savings in electricity costs and reduced water loss, while ensuring the high European standards of water quality for citizens.
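A sketch of the monitoring level's core idea, comparing measured flow against a model prediction and flagging sustained residuals; the demand pattern, noise level, and threshold rule are illustrative assumptions, not EFFINET's actual detectors.

```python
import numpy as np

rng = np.random.default_rng(1)
hours = np.arange(48)
# Hypothetical demand model: a daily sinusoidal pattern around a base flow.
predicted = 100 + 30 * np.sin(2 * np.pi * hours / 24)
measured = predicted + rng.normal(0, 2, hours.size)
measured[36:] += 15          # inject a leak-like offset over the last 12 hours

residual = measured - predicted
threshold = 3 * residual[:24].std()      # calibrate on the first, fault-free day
alarms = hours[np.abs(residual) > threshold]
print("abnormal hours:", alarms)
```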
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin; the GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation in terms of flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
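A toy sketch of the GA/MPC coupling described here: candidate weir settings are scored by a stand-in predictive model of flood volume, then selected and mutated over generations. The river model, cost function, and optimum are invented placeholders for the conceptual model used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
N_WEIRS, POP, GENERATIONS = 3, 20, 40

def predicted_flood_volume(gates):
    # Placeholder for the conceptual river model: a smooth cost over the
    # gate openings (0..1) with a known best setting at (0.2, 0.7, 0.5).
    target = np.array([0.2, 0.7, 0.5])
    return float(np.sum((gates - target) ** 2))

population = rng.random((POP, N_WEIRS))          # semi-random initial gate settings
for _ in range(GENERATIONS):
    costs = np.array([predicted_flood_volume(g) for g in population])
    elite = population[np.argsort(costs)[: POP // 2]]        # selection
    children = elite + rng.normal(0, 0.05, elite.shape)      # mutation
    population = np.clip(np.vstack([elite, children]), 0, 1)

best = population[np.argmin([predicted_flood_volume(g) for g in population])]
print("best gate settings:", np.round(best, 2))
```

The loop mirrors the division of labour in the abstract: the predictive model scores each candidate over the forecast horizon, while the GA supplies and refines the candidates.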
Abstract:
We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and predictions of future water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise of the hydrograph, an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as they are collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach in the efficient real-time measurement of stormwater systems.
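A sketch of the node's decision rule, shortening the sampling interval when the rain forecast or the local state estimate suggests an abrupt change is likely; the thresholds, intervals, and forecast inputs are hypothetical stand-ins for the embedded model and weather-server queries described above.

```python
def next_sampling_interval(rain_prob, flow_trend,
                           base_s=900, fast_s=60):
    """Return the next sampling interval in seconds.

    rain_prob:  forecast probability of precipitation (0..1), e.g. parsed
                from a public weather server response (placeholder here).
    flow_trend: recent change in the local flow estimate, in %/hour.
    """
    if rain_prob > 0.6 or abs(flow_trend) > 20:
        return fast_s          # likely rising hydrograph: sample every minute
    if rain_prob > 0.3:
        return base_s // 3     # elevated risk: sample more often
    return base_s              # quiescent: conserve energy and reagents

print(next_sampling_interval(rain_prob=0.7, flow_trend=5))    # 60
print(next_sampling_interval(rain_prob=0.1, flow_trend=2))    # 900
```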
Abstract:
This paper examines the real convergence hypothesis across Brazilian states. In order to test for the existence of income convergence, the order of integration of the real Gross State Product (GSP) per capita series is examined, as well as their differences with respect to the state of São Paulo, which is used as a benchmark. Both parametric and semiparametric methods are used, and the results show that convergence is achieved in the cases of Alagoas, Amazonas, Bahia, Goiás, Mato Grosso, Minas Gerais, Pernambuco, Piauí, Rio Grande do Sul, Rio de Janeiro and Santa Catarina, and weakly achieved in the cases of Ceará, Maranhão, Pará, Paraná and Sergipe. The states of Espírito Santo, Paraíba and Rio Grande do Norte show no convergence.
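A sketch of the kind of unit-root check behind such a convergence test: if the income gap between a state and São Paulo is stationary (integrated of order zero), the two series converge. The series here is simulated as mean-reverting, and the specific tests in the paper may differ.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
T = 200
# Simulated log income gap: an AR(1) with |phi| < 1, i.e. mean-reverting.
gap = np.zeros(T)
for t in range(1, T):
    gap[t] = 0.8 * gap[t - 1] + rng.normal(0, 0.05)

stat, pvalue = adfuller(gap)[:2]
print(f"ADF statistic {stat:.2f}, p-value {pvalue:.3f}")
# A small p-value rejects a unit root: the gap is stationary, consistent
# with income convergence toward the benchmark state.
```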
Abstract:
The aim of this paper is to assess the progress of the banking sector before and shortly after the Real Plan. We begin by assessing the drop in inflation revenues (the negative real interest paid on the excess of demand deposits over total reserve requirements) that resulted from the fall in inflation from 40% a month in the pre-Real Plan period to a monthly average of 3.65% (IGP-DI) between July 1994 and May 1995. Then, using financial statement data for a group of 90 banks, we estimate the net losses due to the drop in inflation by analyzing the profitability and other parameters of the banking industry. The calculations are made separately for private, state and federal banks. A further analysis of performance, based on information reported to the CVM (the Brazilian securities and exchange commission) by the six largest private banks in the country, is also discussed.
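A back-of-the-envelope version of the inflation-revenue calculation the paper starts from, using a standard approximation (revenue equals the real depreciation of non-remunerated balances over the month); the deposit base is a made-up figure, not taken from the paper.

```python
def monthly_inflation_revenue(deposit_base, monthly_inflation):
    """Real value transferred from depositors to the bank in one month:
    the purchasing power lost on non-interest-bearing balances."""
    return deposit_base * monthly_inflation / (1 + monthly_inflation)

D = 100.0  # hypothetical excess of demand deposits over reserves (billions)
for label, pi in [("pre-Real Plan", 0.40), ("post-Real Plan", 0.0365)]:
    print(f"{label}: inflation {pi:.2%}/month -> "
          f"revenue {monthly_inflation_revenue(D, pi):.1f} per month")
```

With these illustrative numbers the monthly inflation revenue falls from about 28.6 to about 3.5 per 100 units of deposits, which conveys the scale of the loss the paper then traces through bank profitability.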
Abstract:
The Real Plan has succeeded in stabilizing Brazilian inflation. Consumer price inflation has been reduced from 11,260 percent per year in June 1994 to an estimated 8 percent in 1997. The lower inflation resulted in a remarkable redistribution of income and in increased private consumption. The plan managed to control the inflationary effects of the increased demand with some traditional measures: a more liberalized economy, a moving (and overvalued) exchange rate band, high interest rate differentials, and a tight domestic credit policy. The government has, so far, failed to accomplish the fiscal adjustment. The price stabilization has largely depended on the current account deficit. However, macroeconomic indicators do not present reasons for concern about the sustainability of the current account in the medium run. The economy may be trapped in a low-growth vicious cycle, represented by a stop-and-go trend, due to the two-way endogeneity between domestic saving and growth. Economic growth depends on policies to increase public sector saving, to secure the privatization of state enterprises, and to promote investment. The major problem for government action is, as always, in the political sphere. Approximately 80 percent of the central government's net revenue is allocated to the social sectors. Consequently, the fiscal reform will have to deal with the problem of redesigning the public sector's intervention in the social area. Most probably, it will be inevitable to cut the social area budget, which is politically unpleasant.