The following figure is from the article, and it provides a framework for thinking about students’ experience as a whole. I may adopt this model for my data analysis, though data for every category in this model may not be available online.
First of all, I think we were very brave to try holding one class remotely. It is a very important experience for a class studying the Internet. I don’t know whether previous students in this class have tried remote classes, but I hope Dr. V lets students in the future continue to try, even though it may not be a pleasant experience. I also hope future students can find better software than Adobe Connect for this. No revolution or progress will happen if we always shy away from unpleasant possibilities. The key here is not to avoid doing things wrong, but to know what went wrong and to make it right.
Second, regarding what went wrong, I think the key failure point is that each of us had too much freedom for ourselves, but not enough freedom to participate in the class through Adobe Connect. We could not see others unless they were the current presenter, and others could not see us unless we were presenting. Others could not hear us either unless we pressed the “talk” button. We lost supervision, and self-discipline doesn’t work all the time. Basically we could do anything else: eat, do homework for other subjects, check the Facebook feed, etc. We had too much freedom, and I ate too many Milk Duds while listening to others’ talks. Adobe Connect may be perfect for people who have to attend boring conferences all the time, but it is definitely not for engaging students in class. Although all of us had talk privileges, we had to press the “talk” button at the risk of interrupting the presenter, overloading the network, and introducing lots of echoes. So many of us would rather type in the chat box than speak. The social affordances of Adobe Connect don’t support free discussion as well as Google+ Hangouts does.
Finally, Adobe Connect is designed for formal conferences, and Google+ Hangouts is designed for informal, friendly hangouts. Using both for Tech621 made me realize that they really do differ a lot according to their specific purposes. Any such software in the future, if intended for remote classes, should treat engaging students (making students want to participate and find it easy to do so) as the highest priority.
1. APA Citation:
There is a fair amount of discussion that Google search, Amazon recommendations, and Facebook streams are too personalized and are making us closed-minded: the so-called filter bubble or echo chamber effect. I’ve been thinking about this for a while, and I have a new idea about what makes personalization and recommendation bad.
It is not the machine, not the algorithms. It is human nature. The machine ultimately learns from human beings, and the machine is just augmenting human reality. My argument is that for a person with a relatively open and balanced mind, the personalization results will be fairly balanced, and the recommendations will serve knowledge discovery well. Only for people who already hold heavily biased opinions and worldviews will the personalization results be biased. I don’t think the machine is making things worse; the machine is just reflecting reality, and it is good that the machine makes us see that reality so we can figure out a way to address it. What produces these biased and closed opinions is essentially human nature, and it’s not the machine’s responsibility to solve this problem.
Closed-minded people have always existed, whether or not Google search does. Even without Google search and similar tools, these people would not seek out or listen to opinions outside their chamber whatsoever. We dream that the machine could actually solve this problem by providing opposing and diverse opinions (efforts like Findory news), but it is not that easy: it is difficult to move people out of their comfort zone, which is why Findory failed.
To solve this problem, we have to open our own minds first, or at least some of us do. Then we can figure out a way to open other people’s minds more effectively, and make the machine do it.
To sum up, what I am arguing is that, at this moment, machine personalization and recommendation are not doing bad things (not particularly good things either; they are just facts); they are simply reflecting human reality. What we have to realize is that the problem is not the machine but ourselves; the machine is just letting us see flaws that we sometimes don’t want to face. In the future, we need to figure out ways to make the machine do good on this front.
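The argument above can be illustrated with a toy sketch (my own, not any real system’s algorithm): a recommender that learns purely from a user’s history simply reproduces the diversity, or bias, of that history.

```python
from collections import Counter
import random

def recommend(history, n=10, seed=0):
    """Sample n recommendations in proportion to topic frequency in history."""
    rng = random.Random(seed)
    counts = Counter(history)
    topics = list(counts)
    weights = [counts[t] for t in topics]
    return rng.choices(topics, weights=weights, k=n)

# An open-minded user reads both sides; a closed-minded one mostly reads one.
balanced = ["left", "right"] * 5
biased = ["left"] * 9 + ["right"]

# The machine adds no bias of its own; it mirrors what it was given.
print(Counter(recommend(balanced, n=1000)))  # roughly even mix
print(Counter(recommend(biased, n=1000)))    # heavily skewed toward "left"
```

The point is that the skew in the second output comes entirely from the input profile, not from the sampling procedure itself.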
1. APA Citation
Social Discovery: the collaborative processes that promote creating capabilities and seeking solutions. “In its most ambitious form social discovery is the detection of new and important relationships, patterns or principles that advance disciplines and make valuable contributions to society.”
Proposes a framework for the design of search tools that support social discovery.
It is based on previous theories of information seeking, work in CSCW, and the Reader-to-Leader framework of social participation.
Figure 1. The Reader-to-Leader framework suggests that the typical path for social media participation moves from reading online content to making contributions, initially small edits but growing into more substantive contributions. The user-generated content can be edits to a wiki, comments in a discussion group, or ratings of movies, photos, music, animations or videos. Collaborators work together over periods of weeks or months to make more substantial contributions, and leaders act to set policies, deal with problems, and mentor new users.
5. Main Findings
(1) The shift in search tools: specific fact-finding -> open-ended exploratory search -> social discovery (collaboration in creating capabilities and seeking solutions)
(2) The Social Discovery framework. “The social discovery concept extends the ideas from the creativity and discovery support tools based on information visualization, team coordination and design tools.” It emphasizes not only information seeking, but also participation and creativity. “Valuable contributions also come from those who tag, taxonomize, comment, annotate, rank, rate, review and summarize.”
Figure 2. The Social Discovery framework describes the two stages of work: creating capacity and seeking solutions. These are carried out by a dialog between those who initiate requests and those who provide responses over a period of weeks and months.
6. Analysis: This is a theory paper from a computer scientist. Ben Shneiderman is a big figure in HCI and information visualization. He advocates a shift from science to Science 2.0 that really considers real social problems by utilizing the web. This is a framework, or a guideline, for the design of computational tools that better support humans and their social interactions in the process of searching for knowledge. It is set to “promote thinking about and conducting research into the mechanisms that facilitate social discovery”. It mentions that “the implications are profound for academic, industrial and government researchers, since they force re-consideration of reward structures, especially for creating capabilities, which deserve more recognition in tenure or promotion reviews.” I am excited to see a big figure in computer science really embrace the idea of using social media to do good for humanity, and I greatly admire his thoughts and insight.
- APA Citation:
Willett, W., Heer, J., Hellerstein, J., & Agrawala, M. (2011). CommentSpace: Structured support for collaborative visual analysis. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems (pp. 3131–3140).
- Purpose: (1) Present details of CommentSpace, a web-based collaborative visual analysis system that aims to help users better make sense of visualizations by synthesizing others’ comments. CommentSpace “enables analysts to annotate visualizations and apply two additional kinds of structure: 1) tags that consist of descriptive text attached to comments or views; and 2) links that denote relationships between two comments or between a comment and a specific visualization state or view. The resulting structure can help analysts navigate, organize, and synthesize the comments, and move beyond exploration to more complex analytical tasks.” (2) Evaluate this system: “how a small, fixed vocabulary of tags (question, hypothesis, to-do) and links (evidence-for, evidence-against) can help analysts collect and organize new evidence, identify important findings made by others, and synthesize their findings” and “establish common ground”.
- Methods: (1) Present technical details of the system’s design along with a usage scenario. (2) Evaluate via two controlled user studies and a live deployment, comparing CommentSpace with a similar system that doesn’t support tags and links.
- Main findings: (1) A small, fixed vocabulary of tags and links helps analysts more consistently and accurately classify evidence and establish common ground. (2) Managing and incentivizing participation is important for analysts to progress from exploratory analysis to deeper analytical tasks. (3) Tags and links can help teams complete evidence-gathering and synthesis tasks, and organizing comments using tags and links improves analytic results.
- Analysis: (1) This paper is from the “garden” of information visualization and visual analytics. This line of work (collaborative visual analytics) draws from and expands into CSCW and social media research. Because computing systems ultimately serve people within their social contexts, and also because of the popularity of the web, many technical systems are implemented on the web and thus seek to support people, their communication, and their collaboration. I see this emerging point of convergence between social media and visualization techniques, but there are still huge discrepancies in ways of thinking and working among researchers in different disciplines (especially computer science and social science). Traditionally, user studies in the technical world often lack rigor or depth. “It was almost a joke in some technical domains that reviewers of papers just need to check the mental box of the existence of user studies without considering the quality”. A large part of those papers is dedicated to “fancy algorithms”. The future of social computing calls for close collaboration between computer scientists and social scientists, and furthermore engineers, artists, and designers. (2) This paper relates to my project on integrating user participation in rating, tagging, and commenting on academic papers. CommentSpace is designed as modular software that can run in conjunction with any interactive visualization system or website that treats each view of the data as a discrete state, so I look forward to adopting it, or some elements of it, in my project in the future.
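To make the paper’s structuring idea concrete for my own project, here is a minimal sketch of the data model the paper describes: comments attached to visualization views, a small fixed tag vocabulary, and typed links between comments. All names here are hypothetical, my own shorthand rather than CommentSpace’s actual API.

```python
from dataclasses import dataclass, field

# Fixed vocabularies from the paper's evaluation.
TAGS = {"question", "hypothesis", "to-do"}
LINK_TYPES = {"evidence-for", "evidence-against"}

@dataclass
class Comment:
    author: str
    text: str
    view: str                          # id of the visualization state/view
    tags: set = field(default_factory=set)

@dataclass
class Link:
    source: "Comment"
    target: "Comment"
    kind: str                          # one of LINK_TYPES

def add_tag(comment, tag):
    """Attach a tag, enforcing the small fixed vocabulary."""
    if tag not in TAGS:
        raise ValueError(f"unknown tag: {tag}")
    comment.tags.add(tag)

def link(source, target, kind):
    """Create a typed link between two comments."""
    if kind not in LINK_TYPES:
        raise ValueError(f"unknown link type: {kind}")
    return Link(source, target, kind)

# Usage: mark a hypothesis and attach supporting evidence from another view.
h = Comment("alice", "Enrollment drops after 2008.", view="chart-3")
add_tag(h, "hypothesis")
e = Comment("bob", "The 2009 view shows the same dip.", view="chart-4")
support = link(e, h, "evidence-for")
```

Restricting tags and link types to a small fixed set is what the study found helps analysts classify evidence consistently, which is why the sketch rejects anything outside the vocabulary.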
I wrote this on Valentine’s Day this year. It still looks like a brilliant post~