842 results for Real assets and portfolio diversification


Relevance: 100.00%

Abstract:

New global models of society of the neuronet type are considered, together with the hierarchical structure of society and the mentality of the individual. A way of incorporating the anticipatory (prognostic) ability of the individual into the model is described, along with some implementations of the approach for real tasks and further research problems. The multivaluedness of models and solutions is discussed, as is an analogy with sensory-motor systems. New problems for the theory and applications of neural networks are outlined.

Relevance: 100.00%

Abstract:

Let p(z) be an algebraic polynomial of degree n ≥ 2 with real coefficients such that p(i) = p(−i). According to the Grace-Heawood theorem, at least one zero of the derivative p′(z) lies in the disk centered at the origin with radius cot(π/n). In this paper the smallest domain containing at least one zero of the derivative p′(z) is found.
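
As a reading aid, the disk quoted in the abstract is the special case a = i, b = −i of the usual statement of the Grace-Heawood theorem; a sketch of that general statement (standard formulation, not the paper's refined result):

```latex
% Grace-Heawood theorem (standard form): if p has degree n >= 2 and p(a) = p(b),
% then the derivative p' has at least one zero in the disk
\[
  \left| z - \tfrac{a+b}{2} \right| \;\le\; \tfrac{|b-a|}{2}\,\cot\frac{\pi}{n}.
\]
% With a = i and b = -i the center is the origin and the radius is cot(pi/n),
% which is exactly the disk mentioned in the abstract; the paper finds a smaller
% domain that must still contain a zero of p'.
```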

Relevance: 100.00%

Abstract:

The paper presents a different vision for personalizing the user's stay in a cultural heritage digital library: it models services for personalized content marking, commenting, and analysis that do not require a strict user profile but instead adapt to the user's individual needs. The solution is borrowed from how people actually work with and study traditional written sources (including books and manuals), where the user mainly underlines important parts of the content, writes notes and inferences, and selects and marks zones of interest in pictures. Special attention is paid to the ability to run learning analysis, allowing the user to experience the digital library content in different, more creative ways.
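
A minimal sketch of how the content-marking and commenting services described above might be represented as data, which the learning analysis could then consume; the class and field names are illustrative assumptions, not the paper's actual model:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Annotation:
    """One personalization action on a digital-library item (assumed model)."""
    user_id: str                    # who made the annotation (no full profile required)
    item_id: str                    # the book page, manuscript scan, etc.
    kind: str                       # "underline" | "note" | "image-zone"
    target: str                     # text range or image-region descriptor, e.g. "chars 120-180"
    body: Optional[str] = None      # free-text comment or inference, if any
    created: datetime = field(default_factory=datetime.utcnow)

@dataclass
class UserJournal:
    """Collected annotations for one user, the input to the learning analysis."""
    user_id: str
    annotations: list[Annotation] = field(default_factory=list)

    def add(self, a: Annotation) -> None:
        self.annotations.append(a)

    def by_kind(self, kind: str) -> list[Annotation]:
        # e.g. all underlines, to estimate which parts of a text the user found important
        return [a for a in self.annotations if a.kind == kind]
```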

Relevance: 100.00%

Abstract:

The multi-polar world in which we now live and work demands re-examination and refinement of the traditional understanding of the internationalization strategies and competitive advantages of multinational firms by incorporating the characteristics of firms from emerging economies. Based on interviews at four Indian multinationals in different industry segments, we present the "voices" of Indian corporate leaders to provide preliminary evidence on the primary motives behind the internationalization process of emerging multinationals from the perspective of linkage, leverage, and learning (LLL). We show how the case-study organizations have evolved into credible global players by leveraging their learning through targeted acquisitions in developed markets to acquire intangible assets and/or by following global clients in search of new markets and competitive advantages.

Relevance: 100.00%

Abstract:

'Takes the challenging and makes it understandable. The book contains useful advice on the application of statistics to a variety of contexts and shows how statistics can be used by managers in their work.' - Dr Terri Byers, Assistant Professor, University of New Brunswick, Canada

A book about introductory quantitative analysis for business students, designed to be read by first- and second-year students on a business studies degree course and assuming little or no background in mathematics or statistics. Drawing on extensive knowledge and experience of how people learn, and in particular how people learn mathematics, the authors show both how and why quantitative analysis is useful in the context of business and management studies, encouraging readers not only to memorise the content but to apply their learning to typical problems. Fully up to date, with comprehensive coverage of IBM SPSS and Microsoft Excel software, the tailored examples illustrate how the programmes can be used and include step-by-step figures and tables throughout. A range of 'real world' and fictional examples, including "The Ballad of Eddie the Easily Distracted" and "Esha's Story", help bring the study of statistics alive. A number of in-text boxouts can be found throughout the book, aimed at readers at varying levels of study and understanding:

• Back to Basics - for those struggling to understand, these explain concepts in the most basic way possible, often relating to interesting or humorous examples
• Above and Beyond - for those racing ahead who want to be introduced to more interesting or advanced concepts that are a little outside of what they may need to know
• Think it over - these get students to stop, engage, and reflect upon the different connections between topics

A range of online resources, including a set of data files and templates for the reader following in-text examples, downloadable worksheets and instructor materials, answers to in-text exercises, and video content, complement the book.

Relevance: 100.00%

Abstract:

There are very few research studies on the macroeconomic inventory behaviour of various countries. It is clear that macro inventories are the result of a large number of individual micro-decisions. However, we believe it is worth analysing how inventories develop in individual countries and why we see different tendencies. This paper is the newest piece in a series of studies on the subject. We use the OECD database to analyse inventory trends between 1987 and 2004 in nine of the most developed economies of the world. Annual inventory investment data are used, and their connections with other components of GDP expenditure (government and private consumption, investment in fixed assets, and the foreign trade balance, as well as the annual growth rate of GDP) are examined by multi-variable statistical analysis. Conclusions include the steadily decreasing tendency of inventory fluctuations, the alternating periods of higher and lower rates of inventory investment, and the differences in the main influencing factors across countries.
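
A minimal sketch of the kind of multi-variable analysis the abstract describes, regressing annual inventory investment on the other GDP expenditure components; the file name, column names, and the use of OLS are illustrative assumptions, not the authors' exact specification:

```python
import pandas as pd
import statsmodels.api as sm

# Assumed layout: one row per country-year, values as shares of GDP or growth rates.
df = pd.read_csv("oecd_inventories_1987_2004.csv")   # hypothetical file name

y = df["inventory_investment"]
X = df[["private_consumption", "government_consumption",
        "fixed_investment", "trade_balance", "gdp_growth"]]
X = sm.add_constant(X)

model = sm.OLS(y, X, missing="drop").fit()
print(model.summary())   # coefficients show which expenditure components co-move with inventories
```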

Relevance: 100.00%

Abstract:

In the EU, as things stand today, we can speak only of a "transfer union" and not of a single market. Euro-denominated money flows also transmit competitiveness in a distorted way: the work and performance embodied in goods and tangible assets, and above all in financial assets, is mispriced. In such a framework, what we call the free-rider problem emerges particularly easily: it becomes possible to consume without actual or measurable delivery of performance, or even without payment, and to obtain free resources too cheaply. The eurozone is also imperfect in many of its transmission mechanisms. Among the many member states squeezed by sovereign debt there are small, medium-sized, and large ones. This fact, together with the general growth and labour-market problems, clearly signals "system-level disturbances", which in this paper we call a performance transmission problem and which we therefore help interpret with an unusual but all the more telling analogy with an electric power transmission system. We show why a well-run large corporation is a better financial planner than a state of the same size. _____ Why are ill-defined transfer mechanisms diverting valuable assets and resources to the wrong destinations within the EU? Why do we witness ongoing pressure in the EU banking sector and in government finances? We offer an unusual answer to these questions: we apply an analogy from physics (an electric generation and distribution network) to show the transmission inefficiency and waste, respectively, of the EU distribution mechanisms. We demonstrate that there are inherent flaws in both the measurement and the distribution of assets and resources amongst the key EU markets: goods, money, and factor markets. In addition, we find that when an international equalizer mechanism is at work (cohesion funds are allocated), many of these equity functions are at risk with respect to their reliable measurement; the metered load factors and the loss/waste factors are especially at risk. The map of desired outcomes does not match the real outcomes, since EU transfers in general are put to work with low efficiency.

Relevance: 100.00%

Abstract:

In 1972 the ionized cluster beam (ICB) deposition technique was introduced as a new method for thin film deposition. At that time the use of clusters was postulated to enhance film nucleation and adatom surface mobility, resulting in high quality films. Although a few researchers reported singly ionized clusters containing 10²-10³ atoms, others were unable to repeat their work. The consensus now is that the film effects in the early investigations were due to self-ion bombardment rather than clusters. Subsequently, in recent work (early 1992), synthesis of large clusters of zinc without the use of a carrier gas was demonstrated by Gspann and repeated in our laboratory. Clusters resulted from very significant changes in two source parameters: crucible pressure was increased from the earlier 2 Torr to several thousand Torr, and a converging-diverging nozzle 18 mm long and 0.4 mm in diameter at the throat was used in place of the 1 mm x 1 mm nozzle used in the early work. While this is practical for zinc and other high vapor pressure materials, it remains impractical for many materials of industrial interest such as gold, silver, and aluminum. The work presented here describes results using gold and silver at pressures of around 1 and 50 Torr in order to study the effect of pressure and nozzle shape. Significant numbers of large clusters were not detected. Deposited films were studied by atomic force microscopy (AFM), for roughness analysis, and by X-ray diffraction.

Nanometer-size islands of zinc deposited on flat silicon substrates by ICB were also studied by atomic force microscopy, and the number of atoms/cm² was calculated and compared to data from Rutherford backscattering spectrometry (RBS). To improve the agreement between the data from AFM and RBS, convolution and deconvolution algorithms were implemented to study and simulate the interaction between tip and sample in atomic force microscopy. The deconvolution algorithm takes into account the physical volume occupied by the tip, resulting in an image that is a more accurate representation of the surface.

One method increasingly used to study the deposited films, both during the growth process and afterwards, is ellipsometry. Ellipsometry is a surface analytical technique used to determine the optical properties and thickness of thin films. In situ measurements can be made through the windows of a deposition chamber. A method was developed for determining the optical properties of a film that is sensitive only to the growing film and accommodates underlying interfacial layers, multiple unknown underlayers, and other unknown substrates. This method is carried out by making an initial ellipsometry measurement well past the real interface and by defining a virtual interface in the vicinity of this measurement.
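
A minimal sketch of the tip-sample interaction idea described above: to first order, the measured AFM image can be modelled as a grey-scale dilation of the true surface by the tip shape, and an erosion by the same tip gives a (partial) deconvolution. The parabolic tip model, array sizes, and function names are assumptions for illustration, not the algorithm actually implemented in this work:

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def parabolic_tip(radius_px: int, curvature: float) -> np.ndarray:
    """Assumed tip shape: an inverted paraboloid with height 0 at the apex."""
    y, x = np.mgrid[-radius_px:radius_px + 1, -radius_px:radius_px + 1]
    return -curvature * (x**2 + y**2).astype(float)

def simulate_afm_image(surface: np.ndarray, tip: np.ndarray) -> np.ndarray:
    # The measured image is (approximately) the dilation of the surface by the tip.
    return grey_dilation(surface, structure=tip)

def deconvolve_afm_image(image: np.ndarray, tip: np.ndarray) -> np.ndarray:
    # Erosion by the same tip removes the tip broadening where the geometry allows it.
    return grey_erosion(image, structure=tip)

if __name__ == "__main__":
    # Toy surface: a single nanometre-scale island on a flat substrate.
    surface = np.zeros((64, 64))
    surface[28:36, 28:36] = 5.0
    tip = parabolic_tip(radius_px=6, curvature=0.1)
    measured = simulate_afm_image(surface, tip)
    recovered = deconvolve_afm_image(measured, tip)
    print("island area, true vs. measured vs. recovered:",
          (surface > 1).sum(), (measured > 1).sum(), (recovered > 1).sum())
```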

Relevance: 100.00%

Abstract:

A comprehensive investigation of sensitive ecosystems in South Florida is reported, with the main goal of determining the identity, spatial distribution, and sources of both organic biocides and trace elements in different environmental compartments. This study presents the development and validation of a fractionation and isolation method for twelve polar acidic herbicides commonly applied in the vicinity of the study areas, including 2,4-D, MCPA, dichlorprop, mecoprop, and picloram, in surface water. Solid phase extraction (SPE) was used to isolate the analytes from abiotic matrices containing large amounts of dissolved organic material. Atmospheric-pressure ionization (API) with electrospray ionization in negative mode (ESP-) in a quadrupole ion trap mass spectrometer was used to characterize the herbicides of interest.

The application of laser ablation ICP-MS (LA-ICP-MS) methodology to the analysis of soils and sediments is also reported in this study. The analytical performance of the method was evaluated on certified standards and real soil and sediment samples. Residential soils were analyzed to evaluate the feasibility of using this powerful technique as a routine and rapid method to monitor potentially contaminated sites. Forty-eight sediments were also collected from semi-pristine areas in South Florida to conduct screening of baseline levels of bioavailable elements in support of risk evaluation. The LA-ICP-MS data were used to perform a statistical evaluation of the elemental composition as a tool for environmental forensics.

A LA-ICP-MS protocol was also developed and optimized for the elemental analysis of a wide range of elements in polymeric filters containing atmospheric dust. A quantitative strategy based on internal and external standards allowed rapid determination of airborne trace elements in filters containing both contemporary African dust and local dust emissions. These distributions were used to qualitatively and quantitatively assess differences in composition and to establish provenance and fluxes to protected regional ecosystems such as coral reefs and national parks.
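
A minimal sketch of the kind of statistical evaluation of elemental composition mentioned above, here using principal component analysis to separate samples by provenance; the element panel, file name, 'site' column, and the choice of PCA are illustrative assumptions, not the exact protocol used in the study:

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Assumed layout: one row per sediment/soil sample, one column per element (mg/kg).
df = pd.read_csv("la_icp_ms_sediments.csv")             # hypothetical file name
elements = ["As", "Cd", "Cr", "Cu", "Ni", "Pb", "Zn"]   # illustrative element panel

X = StandardScaler().fit_transform(df[elements])        # put elements on a common scale
pca = PCA(n_components=2)
scores = pca.fit_transform(X)

df["PC1"], df["PC2"] = scores[:, 0], scores[:, 1]
print(pca.explained_variance_ratio_)                    # variance carried by the first two PCs
print(df.groupby("site")[["PC1", "PC2"]].mean())        # per-site clusters hint at provenance
```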

Relevance: 100.00%

Abstract:

This research is based on the premises that teams can be designed to optimize their performance and that appropriate team coordination is a significant factor in team performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different team configurations. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, coordination mechanisms, and the job's structural characteristics. The TCM can be used to determine the team design characteristics that most likely lead the team to achieve optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using Java and the Cybele Pro agent architecture. The model implements the effect of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. These team-member agents use decision making and explicit and implicit mechanisms to coordinate the job. The model validation included comparing the TCM's results with statistics from a real team and with the results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The results of the ANOVA have been used to recommend the combination of levels of the experimental factors that optimizes the completion time for a team that runs sailboat races. This research's main contribution to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models. In a stochastic job structure, the tasks required to complete the job change during the team's execution of the job. This research proposed three new types of dependencies between tasks required to model a job as a stochastic structure: the conditional-sequential, single-conditional-sequential, and merge dependencies.
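
A minimal sketch of the 2^(6-1) fractional factorial design mentioned above, built from a full 2^5 design with the sixth factor aliased to the five-way interaction; the factor names and the response function are placeholders, not the TCM's actual inputs:

```python
import itertools
import numpy as np

factors = ["team_size", "skill_mix", "explicit_coord", "implicit_coord",
           "job_uncertainty", "decision_rule"]        # illustrative factor names

# Full 2^5 design in coded units (-1/+1) for the first five factors ...
base = np.array(list(itertools.product([-1, 1], repeat=5)))
# ... and the sixth column generated as the product ABCDE (defining relation I = ABCDEF).
design = np.column_stack([base, base.prod(axis=1)])
print(design.shape)   # (32, 6): half of the 64 runs a full 2^6 design would need

rng = np.random.default_rng(42)

def simulated_completion_time(run: np.ndarray) -> float:
    """Placeholder for one TCM simulation run returning the team's completion time."""
    return 100 - 5 * run[0] - 3 * run[2] + 2 * run[4] + rng.normal(0, 1)

responses = np.array([simulated_completion_time(r) for r in design])
# Main-effect estimates: average response at +1 minus average response at -1 for each factor.
effects = {f: responses[design[:, i] == 1].mean() - responses[design[:, i] == -1].mean()
           for i, f in enumerate(factors)}
print(effects)        # the largest effects (in magnitude) drive the recommended team design
```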

Relevance: 100.00%

Abstract:

Bio-molecular interactions exist ubiquitously in all biological systems. The aim of this dissertation project was to construct a powerful surface plasmon resonance (SPR) sensor. The SPR system is used to study bio-molecular interactions in real time and without labeling. A surface plasmon is an oscillation of free electrons in a metal coupled with surface electromagnetic waves. These surface electromagnetic waves provide a sensitive probe for studying bio-molecular interactions on metal surfaces. This project resulted in the successful construction and optimization of a home-made SPR sensor and the development of several new powerful protocols for studying bio-molecular interactions. It was discovered through this project that the limitations of earlier SPR sensors are related not only to instrumentation design and operating procedures, but also to the complex behaviors of bio-molecules on sensor surfaces, which are very different from their behavior in solution. Based on these discoveries, the instrumentation design and operating procedures were fully optimized. A set of existing sensor surface treatment protocols was tested and evaluated, and new protocols were developed in this project. The new protocols have demonstrated excellent performance for studying bio-molecular interactions. The optimized home-made SPR sensor was used to study protein-surface interactions, which are responsible for many complex organic cell activities. The co-existence of different driving forces and their correlation with the structure of the protein and the surface make understanding the fundamental mechanism of protein-surface interactions a very challenging task. Using the improved SPR sensor, the electrostatic interaction and the hydrophobic interaction were studied separately. The results of this project directly confirmed the theoretical predictions for the electrostatic force between the protein and the surface. In addition, this project demonstrated that the strength of the protein-surface hydrophobic interaction does not depend solely on hydrophobicity, as reported earlier; surface structure also plays a significant role.
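
For context, the sensitivity described above comes from the surface plasmon resonance condition; a standard textbook form for prism coupling (Kretschmann configuration), not specific to the home-made instrument, is:

```latex
% Surface plasmon wave vector at a metal/dielectric interface and the
% prism-coupling (Kretschmann) resonance condition; standard textbook forms.
\[
  k_{\mathrm{sp}} \;=\; \frac{\omega}{c}\,
  \sqrt{\frac{\varepsilon_m\,\varepsilon_d}{\varepsilon_m + \varepsilon_d}},
  \qquad
  \frac{\omega}{c}\, n_p \sin\theta_{\mathrm{res}} \;=\; \operatorname{Re}\, k_{\mathrm{sp}}.
\]
% epsilon_m, epsilon_d: permittivities of the metal film and the dielectric (sample) side;
% n_p: prism refractive index. Binding of biomolecules changes epsilon_d near the surface,
% which shifts the resonance angle theta_res; that shift is the quantity the SPR sensor tracks.
```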

Relevance: 100.00%

Abstract:

Developing analytical models that can accurately describe the behaviors of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size, and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet scale. This can be attributed to three factors: 1) current large-scale network simulators are geared towards simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems.

First, this work presents a method for automatically enabling real-time interaction, monitoring, and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments. However, this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with large-scale network models.

Second, this work deals with reducing the memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplications in network models to dramatically reduce the memory required to execute large-scale network experiments.

Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real time can run together with real-world distributed applications and services. As such, real-time network simulation not only alleviates the burden of developing separate models for applications in simulation, but, as real systems are included in the network model, it also increases the confidence level of network simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
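
A minimal sketch of the structural-duplication idea mentioned above: identical sub-structures (for example, a repeated router configuration) are stored once and shared by reference instead of being instantiated per replica, which is a flyweight-style memory reduction. The class names are illustrative assumptions, not the simulator's actual API:

```python
from dataclasses import dataclass
from functools import lru_cache

@dataclass(frozen=True)
class RouterConfig:
    """Immutable per-model-type state that can safely be shared (flyweight)."""
    cpu_mhz: int
    queue_limit: int
    link_bandwidth_mbps: int

@lru_cache(maxsize=None)
def shared_config(cpu_mhz: int, queue_limit: int, link_bandwidth_mbps: int) -> RouterConfig:
    # Identical configurations are created once and reused by every replica.
    return RouterConfig(cpu_mhz, queue_limit, link_bandwidth_mbps)

class Router:
    __slots__ = ("node_id", "config", "queue_len")   # only per-instance state stored here

    def __init__(self, node_id: int, config: RouterConfig):
        self.node_id = node_id       # unique, cannot be shared
        self.config = config         # shared reference, not a copy
        self.queue_len = 0           # mutable runtime state, per instance

# One subnet template replicated many times: 100,000 routers, one shared config object.
cfg = shared_config(1000, 64, 100)
routers = [Router(i, cfg) for i in range(100_000)]
print(all(r.config is cfg for r in routers))   # True: the duplicated structure is stored once
```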

Relevance: 100.00%

Abstract:

Bankruptcy prediction has been a fruitful area of research. Univariate analysis and discriminant analysis were the first methodologies used. While they perform relatively well at correctly classifying bankrupt and non-bankrupt firms, their predictive ability has come into question over time. Univariate analysis lacks the big picture that financial distress entails. Multivariate discriminant analysis requires stringent assumptions that are violated when dealing with accounting ratios and market variables. This has led to the use of more complex models such as neural networks. While the accuracy of the predictions has improved with the use of more technical models, an important point is still missing. Accounting ratios are the usual discriminating variables used in bankruptcy prediction. However, accounting ratios are backward-looking variables; at best, they are a current snapshot of the firm. Market variables are forward-looking variables, determined by discounting future outcomes. Microstructure variables, such as the bid-ask spread, also contain important information: insiders are privy to more information than the retail investor, so if any financial distress is looming, the insiders should know before the general public. Therefore, any bankruptcy prediction model should include market and microstructure variables. That is the focus of this dissertation. The traditional models and the newer, more technical models were tested and compared to the previous literature by employing accounting ratios, market variables, and microstructure variables. Our findings suggest that the more technical models are preferable, and that a mix of accounting and market variables is best at correctly classifying and predicting bankrupt firms. Based on the results, the multi-layer perceptron appears to be the most accurate model. The set of best discriminating variables includes price, the standard deviation of price, the bid-ask spread, net income to sales, working capital to total assets, and current liabilities to total assets.
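
A minimal sketch of a multi-layer perceptron classifier of the kind described above, using the discriminating variables listed in the abstract; the file name, column names, and network size are illustrative assumptions, not the dissertation's exact specification:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed layout: one row per firm-year, 'bankrupt' is 0/1.
df = pd.read_csv("firm_years.csv")                        # hypothetical file name
features = ["price", "price_std", "bid_ask_spread",        # market and microstructure variables
            "net_income_to_sales", "working_capital_to_assets",
            "current_liabilities_to_assets"]               # accounting ratios

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["bankrupt"], test_size=0.3, stratify=df["bankrupt"], random_state=0)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(X_train, y_train)
print("out-of-sample accuracy:", clf.score(X_test, y_test))
```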

Relevance: 100.00%

Abstract:

This research investigated the relationship between investments in fixed assets and free cash flows of U.S. restaurant firms while controlling for future investment opportunities and financial constraints. It also investigated investment-cash-flow sensitivity in the context of economic conditions. Results suggested that investments by small firms (which face greater financial constraints) were less sensitive to cash flows than investments by large firms. Controlling for economic conditions did not significantly change the results. While the debate over the sensitivity of investments to cash flows remains unresolved, it has not been explored widely in industry contexts, especially in services such as the restaurant industry. In addition to its contribution to this literature, this paper provides implications for cash-flow management in publicly traded restaurant companies.
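
A sketch of the standard investment-cash-flow sensitivity specification that the analysis described above resembles; the abstract does not give the exact controls, so the use of Tobin's q as the investment-opportunity proxy and the small-firm interaction term are assumptions:

```latex
% Standard investment-cash-flow sensitivity regression (assumed form, not the paper's exact model):
\[
  \frac{I_{it}}{K_{i,t-1}}
  \;=\; \alpha
  \;+\; \beta\,\frac{CF_{it}}{K_{i,t-1}}
  \;+\; \gamma\, q_{i,t-1}
  \;+\; \delta\,\left(\frac{CF_{it}}{K_{i,t-1}} \times \mathrm{Small}_i\right)
  \;+\; \varepsilon_{it}
\]
% I: investment in fixed assets; K: capital stock; CF: free cash flow;
% q: proxy for future investment opportunities; Small: indicator for financially constrained (small) firms.
% beta measures investment-cash-flow sensitivity; delta captures how it differs for small firms.
```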

Relevance: 100.00%

Abstract:

In the discussion - Travel Marketing: Industry Relationships and Benefits - Andrew Vladimir, Visiting Assistant Professor, School of Hospitality Management at Florida International University, initially states: "A symbiotic relationship exists among the various segments of the travel and tourism industry. The author has solicited the thinking of 37 experts and leaders in the field in a book dealing with these relationships and how they can be developed to benefit the industry. This article provides some salient points from those contributors."

This article could be considered a primer on networking for the hospitality industry. It has everything to do with marketing and the relationships between varied systems in the field of travel and tourism. Vladimir points to instances of success and failure in marketing for the industry at large, and there are points of view from thirty-seven contributing sources here.

"Miami Beach remains a fitting example of a leisure product that has been unable to get its act together," Vladimir observes. "There are some first-class hotels, a few good restaurants, alluring beaches, and a splendid convention center, but there is no synergism between them, no real affinity, and so while visitors admire the Fontainebleau Hilton and enjoy the food at Joe's Stone Crabs, the reputation of Miami Beach as a resort remains sullied."

In describing cohesiveness between exclusive systems, Vladimir says, "If each system can get a better understanding of the inner workings of neighboring related systems, each will ultimately be more successful in achieving its goals." The article suggests that exclusive systems aren't really exclusive at all, or at least they shouldn't be. In a word, competition drives the market, and for a property to stay afloat, aggressive marketing integrated with all attendant resources is crucial. Tisch [Preston Robert Tisch, at the time of this writing the Postmaster General of the United States and formerly president of Loews Hotels and the New York Visitors and Convention Bureau], in talking about the need for aggressive marketing, says: "Never...ever...take anything for granted. Never...not for a moment...think that any product or any place will survive strictly on its own merits."

Vladimir not only sources several knowledgeable representatives in the field of hospitality and tourism, but also links elements as disparate as real estate, car rental, cruise lines and airlines, travel agencies, and traveler profiles to illustrate his points on marketing integration. In closing, Vladimir quotes the Honorable Donna Tuttle, Undersecretary of Commerce for Travel and Tourism: "Uniting the components of this industry in an effective marketing coalition that can compete on an equal footing with often publicly-owned foreign tourism conglomerates and multi-national consortia must be a high priority as the United States struggles to maintain and expand its share of a rapidly changing global market."