677 results for Measurable Multifunctions
Abstract:
Science cannot exist without surveying facts, collecting data, and making use of them. Facts, however, can be concealed or distorted, data can be compiled in many different ways, and the indicators derived from them can be used to represent, or even explain, a complex and changing reality in an oversimplified or outright falsifying manner. _____ Economics cannot do without measuring. However, the required data are not always available or reliable, as some cases of population censuses exemplify. The indicators we use, particularly composite indexes, are often misleading because they oversimplify complex phenomena or processes and neglect important non-measurable ones, as shown by the per capita GDP indicator measuring the development level of countries, and by the composite indexes measuring the “human development” of countries (HDI) or their “national competitiveness” (GCI). To avoid the enchantment of numbers, the quantitative approach must always be combined with, and corrected by, a critical, holistic and qualitative approach.
Abstract:
Meier (2012) gave a "mathematical logic foundation" for the purely measurable universal type space (Heifetz and Samet, 1998). The mathematical logic foundation, however, discloses an inconsistency in the type space literature: a finitary language is used for the belief hierarchies, while an infinitary language is used for the beliefs. In this paper we propose an epistemic model to fix this inconsistency. We show that in this new model the universal knowledge-belief space exists, is complete, and encompasses all belief hierarchies. Moreover, by examples we demonstrate that in this model the players can agree to disagree, i.e., Aumann's (1976) result does not hold, and Aumann and Brandenburger's (1995) conditions are not sufficient for Nash equilibrium. However, we show that if we substitute self-evidence (Osborne and Rubinstein, 1994) for common knowledge, then both Aumann's (1976) and Aumann and Brandenburger's (1995) results hold.
Abstract:
Consideration of environmental criteria is increasingly common both in the literature and in corporate practice, as shown by the growing number of studies and articles on green criteria. In addition, researchers are developing ever more complex methodologies, incorporating environmental criteria, for optimal supplier selection. The aim of this paper is to present and systematize green criteria in supplier evaluation, and to show how large a toolkit is already available to companies that wish to go beyond traditional criteria when evaluating their suppliers. It also discusses the main motivations for companies to integrate green criteria into their supplier evaluation systems. The research found that legal requirements are not the only drivers for companies to assess their suppliers from an environmental perspective. At the same time, companies currently mostly check whether their suppliers have an environmental management system in place, and consider few of the other green criteria already published in the literature. The same applies to methodology: corporate practice shows that companies rarely use the complex methods developed in the literature, preferring easily measurable tools that include fewer criteria. _______ Environmental criteria have become increasingly prevalent, not only in the literature but also in corporate practice. This is shown by the growing number of articles about green criteria. Alongside this, researchers are creating more and more supplier-selection methodologies that contain environmental criteria.
The purpose of this paper is to present and structure green criteria, and to point out what a large selection of methodologies is already available to companies if they want to use not only traditional criteria but environmental ones as well. In addition, this research presents the most common motivations that can lead to the introduction of green criteria in supplier evaluation. It was found that governmental requirements are not the only motivation for companies. For the present, however, companies mostly check whether an environmental management system has been introduced at their suppliers, and do not consider further green criteria, although these are available in the literature. The same applies to the methodologies: it was found that, rather than using the complex, elaborate ones, companies opt for easily measurable methodologies that contain fewer criteria.
Abstract:
The spectral quality of radiation in the understory of two neotropical rainforests, Barro Colorado Island in Panama and La Selva in Costa Rica, is profoundly affected by the density of the canopy. Understory light conditions in both forests bear similar spectral characteristics. In both, the greatest changes in spectral quality occur at low flux densities, as in the transition from extreme shade to small light flecks. Change in spectral quality, as assessed by the red:far-red (R:FR) ratio, the ratio of radiant energy 400-700:300-1100 nm, and the ratio of quantum flux density 400-700:300-1100 nm, is strongly correlated with a drop in the percentage of solar radiation as measurable by a quantum radiometer. Thus, by knowing the percentage of photosynthetic photon flux density (PPFD) in relation to full sunlight, it is possible to estimate the spectral quality in the forest at a particular time and microsite.
Abstract:
The purpose of this study was threefold. The primary purpose was to develop a stress profile for teachers in private schools. The study also addressed two exploratory issues. The first consisted of an examination of possible differences in the levels of on-the-job stress among teachers in different types of private schools. The second was to discuss the findings on private schools in light of the extant literature on public schools, specifically using the data collected by Fimian to develop the Teacher Stress Inventory. The study was conducted with 316 full-time teachers from seven schools in six different states. The instrument employed was the Teacher Stress Inventory (TSI) developed by Fimian (1988), a 10-factor, 49-item self-report measure. The 10 factors consist of five Stress Sources and five Stress Manifestations subscales; the mean of these 10 factors yields the stress construct termed "Total Stress." Of the 437 surveys mailed, 316 usable surveys (72.3%) were returned. The results suggest that private school teachers experience moderate levels of stress. The mean score was 2.27, indicating a lower than average stress level as measured by the TSI. Comparisons between types of private schools revealed no significant differences between the stress levels of teachers in boarding and nonboarding schools. Teachers in large schools experience significantly higher levels of stress than teachers in small and medium-size schools; however, the measurable difference between them translates into a very small difference in the real stress levels of these teachers in their professional lives.
A significant difference was also found between the stress levels of public (M = 2.60) and private school teachers (M = 2.27). Both means fall within the moderate range; however, while private school teachers experience lower than average levels of stress, the stress levels of teachers in public schools fall in the higher than average range. Recommendations for reducing stress levels in both private and public schools are presented, as well as suggestions for future research.
Abstract:
The purpose of this research was to study the effect of the Florida A+ Plan accountability program on curriculum and instruction in four Title I public elementary schools in the Miami-Dade County Public Schools system. It focused on the experiences of the school principals and the classroom teachers of the four schools as they related to curriculum and instruction. The study included an analysis of each school's improvement plans in curriculum and instruction during the school years 1998-2004. The study was conducted in the format of interviews with the school principals and principal-selected classroom teachers who taught third, fourth, or fifth grade during the first six years of the Florida A+ Plan. The analysis of the school improvement plans focused on the implementation of curriculum and instruction at each of the four schools, in particular on the goals and measurable objectives selected by each school to improve its instructional program in the academic subjects of reading, mathematics, writing, and science. The findings indicated that, under pressure to improve their school grade on the Florida A+ Plan, each of the target schools, based on individual needs assessments, restructured its instructional program each school year, as documented in its school improvement plans. The schools altered their programs by analyzing student performance data to realign curriculum and instruction. The analysis of the interviews with the principals and the teachers showed that each school year they restructured their program to align it with the FCAT content. This realigning was a collaborative effort on the part of the administration and the instructional staff.
Abstract:
Although drug trafficking organizations (DTOs) exist and have an effect on health, crime, economies, and politics, little research has explored these entities as political organizations. Legal interest groups and movements have been found to influence domestic and international politics because they operate within legal parameters. Illicit groups, such as DTOs, have rarely been accounted for, especially in the literature on interest groups, though they play a measurable role in affecting domestic and international politics in similar ways. Using an interest group model, this dissertation analyzed DTOs as illicit interest groups (IIGs) to explain their political influence. The analysis included a study of group formation, development, and demise that examined IIG motivation, organization, and policy impact. The data for the study drew from primary and secondary sources, including interviews with former DTO members and government officials, government documents, journalistic accounts, memoirs, and academic research. To illustrate the interest group model, the study examined Medellin-based DTO leaders, popularly known as the "Medellin Cartel." In particular, the study focused on the external factors that gave rise to DTOs in Colombia and how Medellin DTOs reacted to the implementation of counternarcotics efforts. The discussion was framed by the implementation of the 1979 Extradition Treaty negotiated between Colombia and the United States. The treaty was significant because, as drug trafficking became the principal bilateral issue in the 1980s, extradition became a major method of combating the illicit drug business. The study's findings suggested that Medellin DTO leaders had a one-issue agenda and used a variety of political strategies to influence public opinion and all three branches of government (the judicial, the legislative, and the executive) in an effort to invalidate the 1979 Extradition Treaty.
The changes in the life cycle of the 1979 Extradition Treaty correlated with changes in the political power of Medellin-based DTOs vis-à-vis the Colombian government, and international forces such as the U.S. government's push for tougher counternarcotics efforts.
Abstract:
In their discussion entitled - “Unfair” Restaurant Reviews: To Sue Or Not To Sue - by John Schroeder and Bruce Lazarus, Assistant Professors, Department of Restaurant, Hotel and Institutional Management at Purdue University, the authors initially state: “Both advantages and disadvantages exist on bringing lawsuits against restaurant critics who write ‘unfair’ reviews. The authors, both of whom have experience with restaurant criticism, offer practical advice on what realistically can be done by the restaurateur outside of the courtroom to combat unfair criticism.” Well, this is going to be a sticky wicket no matter how you try to defend it, reviews being what they are: very subjective pieces of opinionated journalism, especially in the food industry. And, of course, unless you can prove malicious intent, there really is no basis for a libel suit. So, a restaurateur is at the mercy of written opinion and the press. “Libel is the written or published form of slander which is the statement of false remarks that may damage the reputation of others. It also includes any false and malicious publication which may damage a person's business, trade, or employment,” is the definition of the law provided by the authors. Anecdotally, Schroeder and Lazarus offer a few of the more scathing pieces reviewers have written about particular eating establishments. And, yes, they can be a bit comical, unless you are the owner of an establishment that appears in the crosshairs of such a reviewer. A bad review can kneecap even a popular eatery. “Because of the large readership of restaurant reviews in the publication (consumer dining out habits indicate that nearly 50 percent of consumers read a review before visiting a new restaurant) your business begins a very dangerous downward tailspin,” the authors reveal, with attribution. “Many restaurant operators contend that a bad review can cost them an immediate trade loss of upward of 50 percent,” Schroeder and Lazarus warn.
“The United States Supreme Court has ruled that a restaurant owner can collect damages only if he proves that the statement or statements were made with ‘actual malice,’ even if the statements were untrue,” the authors say by way of citation. And that last portion of the statement cannot be over-emphasized. The First Amendment to the U.S. Constitution does wield a heavy hammer, indeed, and it should. So, what recourse does a restaurateur have? The authors cautiously give a guarded thumbs-up to a lawsuit, but you had better be prepared to prove a misstatement of fact, as opposed to the distinguishable, press-protected right of opinion. For the restaurateur the pitfalls are many, the rewards few and far between, Schroeder and Lazarus will have you know. “…after weighing the advantages and disadvantages of a lawsuit against a critic...the disadvantages are overwhelming,” the authors say. “Chicago restaurant critic James Ward said that someone dumped a load of manure on his yard accompanied by a note that read - Stop writing that s--t! - after he wrote a review of a local restaurant.” Such is a novel, if not legally measurable, tack against an unfavorable review.
Abstract:
In his essay - Toward a Better Understanding of the Evolution of Hotel Development: A Discussion of Product-Specific Lodging Demand - by John A. Carnella, Consultant, Laventhol & Horwath, CPAs, New York, Carnella initially describes his piece by stating: “The diversified hotel product in the United States lodging market has resulted in latent room-night demand, or supply-driven demand resulting from the introduction of a lodging product which caters to a specific set of hotel patrons. The subject has become significant as the lodging market has moved toward segmentation with regard to guest room offerings. The author proposes that latent demand is a tangible, measurable phenomenon best understood in light of the history of the guest room product from its infancy to its present state.” The article opens with a brief depiction of hotel development in the United States, both pre- and post-World War II. To put it succinctly, the author wants you to know that the advent of the interstate highway system changed the complexion of the hotel industry in the U.S. “Two essential ingredients were necessary for the next phase of hotel development in this country. First was the establishment of the magnificently intricate infrastructure which facilitated motor vehicle transportation in and around the then 48 states of the nation,” says Carnella. “The second event…was the introduction of affordable highway travel.” Carnella goes on to say that the next big thing in hotel evolution was the introduction of affordable air travel. “With the airways filled with potential lodging guests, developers moved next to erect a new genre of hotel, the airport hotel,” Carnella advances his picture. Growth progressed with the arrival of the suburban hotel concept, which wasn’t fueled by developments in transportation, but by changes in people’s living habits, i.e. suburban affiliations as opposed to urban and city population aggregates.
The author explores the distinctions between full-service and limited-service lodging operations. “The market of interest with consideration to the extended-stay facility is one dominated by corporate office parks,” Carnella proceeds. These evolutionary stages speak to latent demand, and even further to segmentation of the market. “Latent demand… is a product-generated phenomenon in which the number of potential hotel guests increases as the direct result of the introduction of a new lodging facility,” Carnella brings his unique insight to the table with regard to the specialization process. The demand is already there, just waiting to be tapped. In closing, “…there must be a consideration of the unique attributes of a lodging facility relative to its ability to attract guests to a subject market, just as there must be an examination of the property's ability to draw guests from within the subject market,” Carnella proposes.
Abstract:
The economic development of any region involves some consequences for the environment, so the choice of a socially optimal development plan must consider a measure of each strategy's environmental impact. This dissertation tackles this problem by examining the environmental impacts of new production activities, using the experience of the Carajás region in the north of Brazil. This region, which prior to the 1960s was an isolated outpost of the Amazon area, was integrated into the rest of the country with an unsophisticated but strategic road system and eventually became the second largest iron ore mining area in the world. Finally, in the 1980s, the area was linked by railroad to the nearest seaport on the Atlantic Ocean. The consequence of these changes was a burst of economic growth along the railroad Corridor and neighboring areas. In this work, a Social Accounting Matrix (SAM) is used to construct a two-region (Corridor and surrounding area), fixed-price Computable General Equilibrium (CGE) model to examine the relationship between production and pollution by measuring the different pollution effects of alternative growth strategies. SAMs are a very useful tool for examining the environmental impacts of development because they link production activities to measurable indices of natural resource degradation. The simulation results suggest that the strategies leading to faster economic growth in the short run are also those that lead to faster rates of environmental degradation. The simulations also show that the strategies leading to faster short-run growth do so at the price of a rate of environmental depletion that is unsustainable from a long-run perspective. These results, therefore, support the concern expressed by environmental economists and policy makers regarding the possible trade-offs between economic growth and environmental preservation.
This stresses the need for a careful analysis of the environmental impacts of alternative growth strategies.
Abstract:
The two-photon exchange phenomenon is believed to be responsible for the discrepancy observed between the ratios of proton electric and magnetic form factors measured by the Rosenbluth and polarization transfer methods. This disagreement is about a factor of three at Q² of 5.6 GeV². Precise knowledge of the proton form factors is of critical importance in understanding the structure of this nucleon. The theoretical models that estimate the size of the two-photon exchange (TPE) radiative correction are poorly constrained. This factor was found to be directly measurable by taking the ratio of the electron-proton and positron-proton elastic scattering cross sections, as the TPE effect changes sign with the charge of the incident particle. A test run of a modified beamline was conducted with the CEBAF Large Acceptance Spectrometer (CLAS) at the Thomas Jefferson National Accelerator Facility. This test run demonstrated the feasibility of producing a mixed electron/positron beam of good quality. Extensive simulations performed prior to the run were used to reduce the background rate that limits the production luminosity. A 3.3 GeV primary electron beam was used, resulting in an average secondary lepton beam of 1 GeV. As a result, elastic scattering data for both lepton types were obtained at scattering angles up to 40 degrees for Q² up to 1.5 GeV². The cross section ratio displayed an ε dependence that was itself Q²-dependent at the smaller Q² limits. The magnitude of the average ratio as a function of ε was consistent with previous measurements, and with the elastic (Blunden) model, to within the experimental uncertainties. Ultimately, higher luminosity is needed to extend the data range to lower ε, where the TPE effect is predicted to be largest.
Abstract:
Estuarine productivity is highly dependent on the freshwater sources of the estuary. In Florida Bay, Taylor Slough was historically the main source of fresh water. Beginning in about 1960, and culminating with the completion of the South Dade Conveyance System in 1984, water management practice began to change the quantity and distribution of flow from Taylor Slough into Northeastern Florida Bay. These practices altered salinity and hydrologic parameters that had measurable negative impacts on vertebrate fauna and their habitats. Here, I review those impacts from published and unpublished literature and anecdotal observations. Almost all vertebrates covered in this review have shown some form of population decline since 1984; most of the studies implicate declines in food resources as the main stressor on their populations. My conclusion is that the diversion of fresh water resulted in an ecological cascade starting with hydrologic stresses on primary then secondary producers culminating in population declines at the top of the food web.
Abstract:
Awareness of extreme high tide flooding in coastal communities has been increasing in recent years, reflecting growing concern over accelerated sea level rise. As a low-lying, urban coastal community with high-value real estate, Miami often tops the rankings of cities worldwide in terms of vulnerability to sea level rise. Understanding perceptions of these changes and how communities are dealing with the impacts reveals much about vulnerability to climate change and the challenges of adaptation. This empirical study uses an innovative mixed-methods approach that combines ethnographic observations of high tide flooding, qualitative interviews, and analysis of tidal data to reveal coping strategies used by residents and businesses as well as perceptions of sea level rise and climate change, and to assess the relationship between measurable sea levels and perceptions of flooding. I conduct a case study of Miami Beach's storm water master planning process, which included sea level rise projections, one of the first in the nation to do so, and which reveals the different and sometimes competing logics of planners, public officials, activists, residents, and business interests with regard to climate change adaptation. By taking a deeply contextual account of hazards and adaptation efforts in a local area, I demonstrate how this approach can shed light on some of the challenges posed by anthropogenic climate change and accelerated rates of sea level rise. The findings highlight challenges for infrastructure planning in low-lying, urban coastal areas, and for individual risk assessment in the context of rapidly evolving discourse about the threat of sea level rise. Recognition of the trade-offs and limits of incremental adaptation strategies points to transformative approaches, while also highlighting equity concerns in adaptation governance and planning.
This new impact assessment method contributes to the integration of social and physical science approaches to climate change, resulting in an improved understanding of socio-ecological vulnerability to environmental change.
Abstract:
The contamination of aquatic environments is a phenomenon that dates back to the origins of human civilization and was amplified by the advent of industrial processes. The Jundiaí River, Macaíba's main water source, suffers discharges of effluents from various industries. The study worked on two fronts. On the environmental perception front, semi-structured interviews were conducted, in which textile effluent was identified by the population as the main problem in the river. It was observed that nearly all respondents had concerns about the environment. In addition, individuals include themselves among the causes of the problem, since a significant share recognize that their own activities may damage the environment and people's health. On the other front, experimental monitoring of water quality was conducted through ecotoxicological tests (with Pomacea lineata and Mysidopsis juniae) and physicochemical analyses, which assessed the isolated effect of the textile effluent and its influence on the river against the limits established by Brazilian law. Although the physicochemical analyses were inconclusive about the contribution of the textile effluent to the environmental contamination of the river, the ecotoxicological tests clearly signaled that the effluent may present a risk to aquatic organisms and, consequently, to human health. Thus, in an interdisciplinary way, it was possible to study the cause of the environmental problem identified by the population in the perception phase and its measurable effect on the river's water quality by means of the tests mentioned.
Abstract:
Increasingly, information technology (IT) has been used to sustain business strategies, which has increased its relevance; IT governance is therefore currently seen as one of the priorities of organizations. The search for strategic alignment between business and IT is debated as a factor in business success, yet despite this importance, senior business managers are usually reluctant to take responsibility for decisions involving IT, mainly because of the complexity of its infrastructure. Cloud computing, in turn, is seen as an element capable of assisting in the implementation of organizational strategies, because its characteristics enable greater efficiency and agility in IT, and it is considered a new computing paradigm. The main objective of this study was to analyze the relationship between IT governance arrangements and strategic alignment in the presence of public-cloud infrastructure as a service (IaaS). To this end, an exploratory, descriptive, and inferential study was developed, with a quantitative approach to the problem, using a cross-sectional descriptive survey. An electronic questionnaire was applied to members of the ISACA chapters of São Paulo and the Distrito Federal, totaling 164 respondents. The instrument was based on the theories of Weill and Ross (2006) for the IT governance arrangements matrix; Henderson and Venkatraman (1993) and Luftman (2000) for the strategic alignment maturity model; and NIST (2011b), ITGI (2007), and CSA (2010) for the maturity of public IaaS in its essential characteristics. As for the main results, this research showed that with public IaaS, decision-making structures change, with greater participation of senior executives in all five key IT decisions (the IT governance arrangements matrix), including more technical decisions such as IT architecture and IT infrastructure.
With the increased participation of senior executives, a decrease was also observed in the participation of IT specialists, characterizing the decision process as the duopoly archetype (shared decision). With regard to strategic alignment, it was observed that it changes with cloud computing: organizations with public IaaS showed a statistically significant, higher maturity of strategic alignment compared to organizations without IaaS. The maturity of public IaaS is at the intermediate level (level 3, "defined process"), with elasticity and measurement having reached level 4, "managed and measurable." It was also possible to infer that, in organizations with public IaaS, there are positive correlations between the key decisions and the maturity of IaaS, especially for the architecture and infrastructure decisions and for the archetypes involving senior executives and IT specialists. A correlation was also found between the maturity of public IaaS and strategic alignment maturity: the higher the strategic alignment, the greater the maturity of the public IaaS, and vice versa.