20 results for Digital information environment
in Digital Commons at Florida International University
Abstract:
This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his/her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method for providing said compensation. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, as well as by using a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems and the data collected from these experiments were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method resulted in a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected for the human subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g. pupil diameter) and continuous adjustment in the pre-compensation process to yield maximum viewing enhancement.
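The pre-compensation idea described above can be sketched with a standard inverse-filtering formulation. This is an illustrative stand-in, not the dissertation's exact method: the Gaussian point-spread function (PSF) below is an invented placeholder for one derived from real wavefront data, and `k` is a generic regularization constant.

```python
import numpy as np

def wiener_precompensate(image, psf, k=0.01):
    """Pre-compensate an image for a known blur (PSF) by Wiener inverse
    filtering, so that viewing the result through the blur approximates
    the original image. (Illustrative sketch only.)"""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + k)   # regularized inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(image) * W))

def blur(image, psf):
    """Model the aberrated eye as convolution with its PSF."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

# Toy setup: a bar pattern and a Gaussian PSF standing in for an
# aberrated eye (a real system derives the PSF from wavefront data).
n = 64
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
image = np.zeros((n, n))
image[:, ::8] = 1.0

pre = wiener_precompensate(image, psf)
err_plain = np.sqrt(np.mean((blur(image, psf) - image) ** 2))
err_pre = np.sqrt(np.mean((blur(pre, psf) - image) ** 2))
```

Blurring the pre-compensated image through the same PSF reproduces the pattern more faithfully than blurring the original; in practice the amplified pixel values must also be mapped back into the display's dynamic range, which is one of the display-device shortcomings the dissertation discusses.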
Abstract:
Dual-class stock structure is characterized by the separation of voting rights and cash flow rights. The departure from a common "one share-one vote" configuration creates ideal conditions for conflicts of interest and agency problems between controlling insiders (the holders of voting rights) and remaining shareholders. The owners of voting rights have the opportunity to extract private benefits and act in their personal interest; as a result, dual-class firms are often perceived to have low transparency and high information asymmetry. This dissertation investigates the quality of information and the information environment of firms with two classes of stock. The first essay examines the quality of information by studying accruals in dual-class firms in comparison to firms with only one class of stock. The results suggest that the quality of accruals is better in dual-class firms than in single-class firms. In addition, the difference in the quality of accruals between firms that abolish their dual-class share structure by unification and single-class firms disappears in the post-unification period. The second essay investigates the earnings informativeness of dual-class firms by examining the explanatory power of earnings for returns. The results indicate that earnings informativeness is lower for dual-class firms as compared to single-class firms. Earnings informativeness improves in firms that unify their shares. The third essay compares the level of information asymmetry between dual-class firms and single-class firms. It is documented that the information environment for dual-class firms is worse than for single-class firms. Also, the findings suggest that the difference in information environment between dual-class firms and single-class firms disappears after dual-class stock unification.
Abstract:
The search-experience-credence framework from the economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure), and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources). This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk. This research is the first in-depth study of the role and effects of information in service physical environments. Using a 2 × 2 between-subjects experimental procedure, undergraduate students were exposed to the informational signs in a simulated service physical environment. The service physical environments were simulated through color photographic slides. The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk, and pleasure, (2) experience attribute information shows higher perceived veracity and lower perceived performance risk when compared to search attribute information, and (3) information provided through simulated word-of-mouth shows higher perceived veracity and lower perceived performance risk when compared to information provided through non-personal advocate sources.
Abstract:
Today, many organizations are turning to new approaches to building and maintaining information systems (I/S) to cope with a highly competitive business environment. Current anecdotal evidence indicates that the approaches being used improve the effectiveness of software development by encouraging active user participation throughout the development process. Unfortunately, very little is known about how the use of such approaches enhances the ability of team members to develop I/S that are responsive to changing business conditions. Drawing from predominant theories of organizational conflict, this study develops and tests a model of conflict among members of a development team. The model proposes that development approaches provide the relevant context conditioning the management and resolution of conflict in software development which, in turn, are crucial for the success of the development process. Empirical testing of the model was conducted using data collected through a combination of interviews with I/S executives and surveys of team members and business users at nine organizations. Results of path analysis provide support for the model's main prediction that integrative conflict management and distributive conflict management can contribute to I/S success by influencing differently the manifestation and resolution of conflict in software development. Further, analyses of variance indicate that object-oriented development, when compared to rapid and structured development, appears to produce the lowest levels of conflict management, conflict resolution, and I/S success. The proposed model and findings suggest academic implications for understanding the effects of different conflict management behaviors on software development outcomes, and practical implications for better managing the software development process, especially in user-oriented development environments.
Abstract:
The primary purpose of this research is to study the linkage between perceived job design characteristics and information system environment characteristics before and after the replacement of a legacy information system with a new type of information system (referred to as an Enterprise Resource Planning or ERP system). A public state university implementing an academic version of an ERP system was selected for the study. Three survey instruments were used to examine the perception of the information system, the job characteristics, and the organizational culture before and after the system implementation. The research participants included two large departments, resulting in a sample of 130 workers. Research questions were analyzed using multivariate procedures including factor analysis, path analysis, step-wise regression, and matched-pair analysis. Results indicated that the ERP system has introduced new elements into the working environment that have changed how the job design characteristics and organizational culture dimensions are viewed by the workers. The alignment of the perceived system characteristics with an individual's perceived job design characteristics is supported: each of the system characteristics is significantly correlated in the proposed direction. Stronger support for this relationship is visible in the causal flow of the effects seen in the path diagram and in the step-wise regression. The alignment of the perceived job design characteristics with dimensions of organizational culture is not as strong as the literature suggests. Although there are significant correlations between the job and culture variables, only one relationship can be seen in the causal flow. This research has demonstrated that the system characteristics of ERP do contribute to the perception of change in an organization and do support organizational culture behaviors and job characteristics.
Abstract:
Security remains a top priority for organizations as their information systems continue to be plagued by security breaches. This dissertation developed a unique approach to assess the security risks associated with information systems based on a dynamic neural network architecture. The risks that are considered encompass the production computing environment and the client machine environment. The risks are established as metrics that define how susceptible each of the computing environments is to security breaches. The merit of the approach developed in this dissertation is based on the design and implementation of artificial neural networks to assess the risks in the computing and client machine environments. The datasets that were utilized in the implementation and validation of the model were obtained from business organizations using a web survey tool hosted by Microsoft. This site was designed as a host for anonymous surveys that were devised specifically as part of this dissertation; Microsoft customers can log in to the website and submit their responses to the questionnaire. This work asserted that security in information systems is not dependent exclusively on technology but rather on the triumvirate of people, process, and technology. The questionnaire, and consequently the developed neural network architecture, accounted for all three key factors that impact information systems security. As part of the study, a methodology for developing, training, and validating such a predictive model was devised and successfully deployed. This methodology prescribed how to determine the optimal topology, activation function, and associated parameters for this security-based scenario. The assessment of the effects of security breaches on information systems has traditionally been post-mortem, whereas this dissertation provided a predictive solution with which organizations can determine how susceptible their environments are to security breaches in a proactive way.
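As a rough illustration of the kind of neural-network risk scoring described, here is a minimal feedforward network trained by gradient descent on synthetic "survey" data. Every concrete choice below is invented for the sketch: the six posture features, the one-hidden-layer topology, the activations, and the synthetic labels are placeholders, whereas the dissertation determines the topology and activation function empirically from real survey responses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for survey responses: each row scores an organization's
# people / process / technology posture on a 0-1 scale (invented features).
X = rng.random((200, 6))
# Invented ground truth: breach susceptibility rises as posture scores fall.
y = (X.mean(axis=1) < 0.5).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; topology/activation would be selected empirically.
W1 = rng.normal(0, 0.5, (6, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)            # hidden layer
    return sigmoid(h @ W2 + b2), h      # risk metric in (0, 1)

losses = []
lr = 0.5
for _ in range(500):
    p, h = forward(X)
    losses.append(float(np.mean((p - y) ** 2)))
    # Backpropagation for the squared-error loss (constant factors folded
    # into the learning rate).
    d2 = (p - y) * p * (1 - p) / len(X)
    dW2 = h.T @ d2; db2 = d2.sum(axis=0)
    d1 = (d2 @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ d1; db1 = d1.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

risk, _ = forward(X)   # per-organization susceptibility metric
```

The output is a continuous susceptibility score per organization, which matches the abstract's framing of risk as a metric rather than a post-mortem classification.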
Abstract:
This study explored the strategies that community-based, consumer-focused advocacy, alternative service organizations (ASOs) implemented to adapt to changes in the nonprofit funding environment (Oliver & McShane, 1979; Perlmutter, 1988a, 1994). It is not clear to what extent current funding trends have influenced ASOs, as little empirical research has been conducted in this area (Magnus, 2001; Marquez, 2003; Powell, 1986). This study used a qualitative research design to investigate strategies implemented by these organizations to adapt to changes such as decreasing government, foundation, and corporate funding and an increasing number of nonprofit organizations. More than 20 community informants helped to identify, locate, and provide information about ASOs. Semi-structured interviews were conducted with a sample of 30 ASO executive directors from diverse organizations in Miami-Dade and Broward Counties, in South Florida. Data analysis was facilitated by the use of ATLAS.ti, version 5, a qualitative data analysis computer software program designed for grounded theory research. This process generated five major themes: Funding Environment; Internal Structure; Strategies for Survival; Sustainability; and Committing to the Cause, Mission, and Vision. The results indicate that ASOs are struggling to survive financially by cutting programs, decreasing staff, and limiting service to consumers. They are also exploring ways to develop fundraising strategies; for example, increasing the number of proposals written for grants, focusing on fund development, and establishing for-profit ventures. Even organizations that state that they are currently financially stable are concerned about their financial vulnerability. There is little flexibility or cushioning to adjust to "funding jolts." The fear of losing current funding levels and being placed in a tenuous financial situation is a constant concern for these ASOs. Further data collected from the self-administered Funding Checklist and demographic forms were coded and analyzed using the Statistical Package for the Social Sciences (SPSS). Descriptive information and frequencies generated findings regarding revenue, staff complement, use of volunteers and fundraising consultants, and fundraising practices. The study proposes a model of funding relationships and presents implications for social work practice and policy, along with recommendations for future research.
Abstract:
Objectionable odors remain at the top of air pollution complaints in urban areas such as Broward County, which is subject to increasing residential and industrial development. Odor complaints in Broward County escalated by 150 percent over the 2001 to 2004 period, although the population increased by only 6 percent; it is estimated that by 2010 the population will increase to 2.5 million. Relying solely on enforcing the local odor ordinance is evidently not sufficient to manage the escalating odor complaint trends. An alternate approach, similar to the odor management plans (OMPs) that are successful in managing major malodor sources such as animal farms, is required. This study aims to develop and determine the feasibility of implementing a comprehensive odor management plan (COMP) for the entire Broward County. Unlike existing OMPs for single sources, where the receptors (i.e., the complainants) are located beyond the boundary of the source, the COMP addresses a complex model of multiple sources and receptors coexisting within the boundary of the entire county. Each receptor is potentially subjected to malodor emissions from multiple sources within the county, and the quantity and quality of the source/receptor variables are continuously changing. The results of this study show that it is feasible to develop a COMP that adopts a systematic procedure to: (1) generate maps of existing odor complaint areas and malodor sources, (2) identify potential odor sources (target sources) responsible for existing odor complaints, (3) identify possible odor control strategies for target sources, (4) determine the criteria for implementing odor control strategies, (5) develop an odor complaint response protocol, and (6) conduct odor impact analyses for new sources to prevent future odor-related issues. A Geographic Information System (GIS) is used to identify existing complaint areas. COMP software that incorporates existing United States Environmental Protection Agency (EPA) air dispersion software is developed to determine the target sources, predict the likelihood of new complaints, and conduct odor impact analysis. The odor complaint response protocol requires pre-planning field investigations and conducting surveys, optimizing the local agency's available resources while protecting citizens' welfare, as required by the Clean Air Act.
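For a sense of what the underlying dispersion computation looks like, here is the textbook ground-level Gaussian plume formula of the kind that screening-level EPA dispersion models build on. The emission rate, wind speed, stack height, and dispersion coefficients below are invented placeholders, not values from the study, and full EPA models add stability-class-dependent coefficient curves, plume rise, and terrain effects.

```python
import math

def plume_concentration(q, u, sigma_y, sigma_z, y, h_stack):
    """Ground-level Gaussian plume concentration (g/m^3) at crosswind
    offset y (m), for emission rate q (g/s), wind speed u (m/s), stack
    height h_stack (m), and dispersion coefficients sigma_y, sigma_z (m).
    Includes the standard ground-reflection term."""
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y ** 2 / (2 * sigma_y ** 2))
            * math.exp(-h_stack ** 2 / (2 * sigma_z ** 2)))

# Invented example: 5 g/s source, 3 m/s wind, coefficients corresponding
# to some downwind distance and stability class (normally read from curves).
c_centerline = plume_concentration(5.0, 3.0, 60.0, 30.0, y=0.0, h_stack=20.0)
c_offset = plume_concentration(5.0, 3.0, 60.0, 30.0, y=100.0, h_stack=20.0)
c_taller = plume_concentration(5.0, 3.0, 60.0, 30.0, y=0.0, h_stack=40.0)
```

Concentration falls off both crosswind and with stack height, which is why identifying the target source behind a complaint must weigh plume geometry as well as emission strength.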
Abstract:
This research, conducted in 2006-2008, examines the ways in which various groups involved with the marine resources of Seward, Alaska, construct attitudes towards the environment. Participant observation and semi-structured interviews are used to assess how commercial halibut fishers, tour boat operators, local residents, and government officials understand the marine environment based on their previous experiences. This study also explores how ideologies relate to the current practices of each group. Two theories orient the analyses: the first, cultural modeling, provided a theoretical and methodological framework for pursuing a more comprehensive analysis of resource management; the second, the Theory of Reasoned Action (Ajzen and Fishbein 1980), guided the analysis of the ways in which each participant's ideology towards the marine environment relates to their practice. Aside from contributing to a better understanding of a coastal community's ideologies and practices, this dissertation sought to better understand the role of ecological ideologies and behaviors in fisheries management. The research illustrates certain domains where ideologies and practices concerning Pacific halibut and the marine environment differ among commercial fishers, government and management officials, tour boat operators, and residents of Seward, AK. These differences offer insights into how future collaborative efforts between government officials, managers, and local marine resource users might better incorporate local ideology into management, and provide ecological information to local marine resource users in culturally appropriate ways.
Abstract:
Since multimedia data, such as images and videos, are far more expressive and informative than ordinary text-based data, people find them more attractive for communication and expression. Additionally, with the rising popularity of social networking tools such as Facebook and Twitter, multimedia information retrieval can no longer be considered a solitary task; rather, people constantly collaborate with one another while searching for and retrieving information. But the very cause of the popularity of multimedia data, the large and varied amount of information a single data object can carry, makes its management a challenging task. Multimedia data are commonly represented as multidimensional feature vectors and carry high-level semantic information. These two characteristics make them very different from traditional alphanumeric data, so trying to manage them with frameworks and rationales designed for primitive alphanumeric data is inefficient. An index structure is the backbone of any database management system, and the index structures present in existing relational database management frameworks cannot handle multimedia data effectively. Thus, in this dissertation, a generalized multidimensional index structure is proposed which accommodates the atypical multidimensional representation and the semantic information carried by different multimedia data seamlessly within one single framework. Additionally, the dissertation investigates the evolving relationships among multimedia data in a collaborative environment and how such information can help customize the design of the proposed index structure when it is used to manage multimedia data in a shared environment. Extensive experiments were conducted to demonstrate the usability and better performance of the proposed framework over current state-of-the-art approaches.
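The general idea of a multidimensional index, bucketing feature vectors so that a query touches only a fraction of the data, can be illustrated with a toy uniform-grid index. This is a generic sketch, not the index structure the dissertation proposes (which also accommodates semantic information and collaborative relationships); the dimensionality and grid resolution are arbitrary choices for the example.

```python
import numpy as np
from collections import defaultdict

class GridIndex:
    """Toy uniform-grid index over d-dimensional feature vectors in [0, 1)."""

    def __init__(self, vectors, cells_per_dim=4):
        self.vectors = np.asarray(vectors, dtype=float)
        self.k = cells_per_dim
        self.buckets = defaultdict(list)
        for i, v in enumerate(self.vectors):
            self.buckets[self._cell(v)].append(i)

    def _cell(self, v):
        # Map a point to its grid cell, clamping the upper boundary.
        return tuple(np.minimum((v * self.k).astype(int), self.k - 1))

    def range_query(self, lo, hi):
        """Return indices of vectors inside the axis-aligned box [lo, hi]."""
        lo = np.asarray(lo); hi = np.asarray(hi)
        c_lo, c_hi = self._cell(lo), self._cell(hi)
        hits = []
        # Visit only buckets whose cell overlaps the query box.
        for off in np.ndindex(*[h - l + 1 for l, h in zip(c_lo, c_hi)]):
            key = tuple(l + o for l, o in zip(c_lo, off))
            for i in self.buckets.get(key, []):
                v = self.vectors[i]
                if np.all(v >= lo) and np.all(v <= hi):
                    hits.append(i)
        return sorted(hits)

rng = np.random.default_rng(1)
feats = rng.random((500, 3))           # e.g. 3-D image feature vectors
index = GridIndex(feats)
found = index.range_query([0.2, 0.2, 0.2], [0.4, 0.4, 0.4])
brute = sorted(np.flatnonzero(
    np.all((feats >= 0.2) & (feats <= 0.4), axis=1)).tolist())
```

Relational B-tree-style indexes order data along one attribute at a time, which is exactly what breaks down for feature vectors; a multidimensional structure prunes the search space in all dimensions at once.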
Abstract:
This dissertation analyzes the obstacles to further cooperation in international economic relations. The first essay explains the gradual nature of trade liberalization. I show that the existence of asymmetric information between governments provides a sufficient reason for gradualism to exist. Governments prefer starting small to reduce the cost of a partner's betrayal when there is a sufficient degree of information asymmetry regarding the partner's type. Learning about the partner's incentive structure enhances expectations, encouraging governments to increase their current level of cooperation. Specifically, the uninformed government's subjective belief that the trading partner is good improves as the partner acts cooperatively. This updated belief, in turn, lowers the subjective probability of future betrayal, enabling further progress in cooperation. The second essay analyzes the relationship between two countries facing two policy dilemmas in an environment with two-way goods and capital flows. When issues are independent and countries are symmetric, signing separate agreements for tariffs (Free Trade Agreements, FTAs) and for taxes (Tax Treaties, TTs) provides the same level of enforcement as signing a linked agreement. However, linkage can still improve joint welfare by transferring the slack enforcement power in the case of asymmetric issues or countries. I report non-results in two cases where the policy issues are interconnected due to the technological spillover effect of FDI. Moreover, I show that linking the agreements actually reduces enforcement when agreements are linked under a limited punishment rule and the policy variables are strategic substitutes. The third essay investigates the welfare and enforcement consequences of linking trade and environmental agreements. In the standard literature, linking the agreements generates non-trivial results only when there is a structural relation between the issues. I focus on the institutional design of the linkage and show that even if the environmental aspects of international trade are negligible, linking the agreements might still have interesting welfare implications under current GATT rules. Specifically, when traded goods are substitutes in consumption, linking the environmental agreement with the trade agreement under the Withdrawal of Equivalent Concessions rule (Article XXVIII) will reduce enforcement. However, enforcement on the environmental issue increases when the same rule is implemented in the absence of linkage.
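The belief-updating mechanism behind gradualism in the first essay can be written as a standard Bayesian revision (the notation here is mine, not the essay's): if $\mu_t$ is the uninformed government's subjective probability that the partner is the good type, and a good type cooperates with probability $p_G$ while a bad type cooperates with probability $p_B < p_G$, then observing cooperation yields

$$\mu_{t+1} = \frac{\mu_t \, p_G}{\mu_t \, p_G + (1-\mu_t)\, p_B} > \mu_t \quad \text{for } 0 < \mu_t < 1,$$

so each cooperative period raises the belief, lowers the perceived probability of future betrayal, and supports a larger level of cooperation in the next period.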
Abstract:
Road pricing has emerged as an effective means of managing road traffic demand while simultaneously raising additional revenue for transportation agencies. Research on the factors that govern travel decisions has shown that user preferences may be a function of the demographic characteristics of the individuals and the perceived trip attributes. However, it is not clear which trip attributes are actually considered in the travel decision-making process, how these attributes are perceived by travelers, and how the set of trip attributes changes as a function of the time of day or from day to day. In this study, operational Intelligent Transportation Systems (ITS) archives are mined and the aggregated preferences for a priced system are extracted at a fine time aggregation level for an extended number of days. The resulting information is related to corresponding time-varying trip attributes such as travel time, travel time reliability, charged toll, and other parameters. The time-varying user preferences and trip attributes are linked together by means of a binary choice model (logit) with a linear utility function on the trip attributes. The trip attribute weights in the utility function are then dynamically estimated for each time of day by means of an adaptive, limited-memory discrete Kalman filter (ALMF). The relationship between traveler choices and travel time is assessed using different rules to capture the logic that best represents traveler perception and the effect of real-time information on the observed preferences. The impact of travel time reliability on traveler choices is investigated considering its multiple definitions. It can be concluded from the results that the ALMF algorithm allows a robust estimation of time-varying weights in the utility function at fine time aggregation levels. The high correlations among the trip attributes severely constrain the simultaneous estimation of their weights in the utility function. Despite the data limitations, it is found that the ALMF algorithm can provide stable estimates of the choice parameters for some periods of the day. Finally, it is found that the daily variation of the user sensitivities for different periods of the day resembles a well-defined normal distribution.
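The recursive estimation step can be illustrated with a basic extended-Kalman sketch: a random-walk state of logit utility weights, updated from observed toll-lane shares by linearizing the logit around the current estimate. The sensitivities, noise levels, and attribute ranges below are invented, and the dissertation's ALMF adds adaptation and limited memory on top of a recursion of this general shape.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
# Invented "true" sensitivities: time saving (+), reliability (+), toll (-).
true_beta = np.array([0.08, 0.05, -0.5])

beta = np.zeros(3)            # state estimate (utility weights)
P = np.eye(3)                 # state covariance
Q = 1e-4 * np.eye(3)          # process noise: weights drift over time
R = 1e-3                      # measurement noise variance

for t in range(2000):
    x = np.array([rng.uniform(0, 20),    # time saving (min)
                  rng.uniform(0, 10),    # reliability gain (min)
                  rng.uniform(0.5, 5)])  # toll ($)
    # Observed share of travelers choosing the priced lane (noisy logit).
    p_obs = sigmoid(x @ true_beta) + rng.normal(0, 0.02)

    # Predict: weights follow a random walk.
    P = P + Q
    # Update: linearize the logit measurement around the current estimate.
    p_hat = sigmoid(x @ beta)
    H = p_hat * (1 - p_hat) * x          # Jacobian of sigmoid(x . beta)
    S = H @ P @ H + R
    K = P @ H / S                        # Kalman gain
    beta = beta + K * (p_obs - p_hat)
    P = P - np.outer(K, H @ P)
```

Because the filter carries a covariance and a process-noise term, the weights can track slow changes across the day rather than being fit once, which is the point of estimating them "for each time of day."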
Abstract:
With the exponentially increasing demand for GIS data visualization systems, in applications such as urban planning, environment and climate change monitoring, weather simulation, and hydrographic gauging, research on and applications of geospatial vector and raster data visualization have become prevalent. However, current web GIS techniques are suitable only for static vector and raster data with no dynamic overlay layers. While it is desirable to enable visual exploration of large-scale dynamic vector and raster geospatial data in a web environment, improving the performance between backend datasets and the vector and raster applications remains a challenging technical issue. This dissertation addresses these open problems: how to provide a large-scale dynamic vector and raster data visualization service with dynamic overlay layers, accessible from various client devices through a standard web browser, and how to make that dynamic service as fast as a static one. To accomplish this, a large-scale dynamic vector and raster data visualization geographic information system based on parallel map tiling, together with a comprehensive performance improvement solution, is proposed, designed, and implemented. The components include: quadtree-based indexing and parallel map tiling, the Legend String, vector data visualization with dynamic layer overlaying, vector data time-series visualization, an algorithm for vector data rendering, an algorithm for raster data re-projection, an algorithm for eliminating superfluous levels of detail, an algorithm for vector data gridding and re-grouping, and server-cluster-side vector and raster data caching.
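Quadtree-based tile indexing of the kind mentioned above is commonly built on the standard web-Mercator tiling scheme, where a tile's x/y bits are interleaved into a quadtree key so that a key prefix addresses a whole subtree. This is a generic sketch of that well-known scheme, not the dissertation's specific index; the coordinates are an arbitrary example.

```python
import math

def latlon_to_tile(lat, lon, zoom):
    """Web-Mercator tile coordinates at a zoom level (standard XYZ scheme)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

def tile_to_quadkey(x, y, zoom):
    """Interleave tile x/y bits into a quadtree key. Each digit picks one of
    four children, so a shared prefix means a shared ancestor tile, which is
    convenient for parallel tiling and cache lookup."""
    digits = []
    for z in range(zoom, 0, -1):
        mask = 1 << (z - 1)
        digits.append(str((1 if x & mask else 0) + (2 if y & mask else 0)))
    return "".join(digits)

x, y = latlon_to_tile(25.7617, -80.1918, 10)   # Miami, zoom level 10
key = tile_to_quadkey(x, y, 10)
parent_key = tile_to_quadkey(x // 2, y // 2, 9)
```

The prefix property (`parent_key == key[:9]`) is what lets a cluster partition tiling work by subtree and lets a cache answer lower-zoom requests from the same keyspace.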