
Social information seeking


their status messages to ask a question, indicating that Q&A on social networks is popular. The types of questions people asked included recommendations, opinions, factual knowledge, and rhetorical questions, among others, and motivations for asking included trust in one's network and the subjective nature of many questions. Their analysis also explored the relationships among answer speed, answer quality, and the properties of questions and participants. Only a very small portion (6.5%) of the questions went unanswered, and 89.3% of the respondents were satisfied with the response time they experienced, even though there was a discrepancy between experience and expectation. The responses gathered via social networks also appeared to be very valuable. Their findings suggest designs for search tools that could combine the speed and breadth of traditional search engines with the trustworthiness, personalization, and high engagement of social media Q&A.
asked personal and health-related questions (11%). Only 18.7% of questions received a response, while a handful of questions received a high number of responses. The larger the asker's network, the more responses they received; however, posting more tweets or posting more frequently did not increase the chances of receiving a response. Most often the "follow" relationship between asker and answerer was one-way. Paul et al. also examined which factors would increase an asker's chance of getting a response, and found that more relevant responses are received when there is a mutual relationship between asker and answerer. Intuitively, we would expect this, as a mutual relationship indicates stronger tie strength and hence more relevant answers.
question answering function perhaps since the advent of Usenet and Bulletin Board Systems, so in one sense cQA is nothing new. Websites dedicated to cQA, however, have emerged on the web only within the past few years: the first cQA site was the Korean Naver Knowledge iN, launched in 2002, while the first English-language cQA site was Answerbag, launched in April 2003. Despite this short history, cQA has already attracted a great deal of attention from researchers investigating information seeking behaviors, selection of resources, social annotations, user motivations, comparisons with other types of question answering services, and a range of other information-related behaviors.
Pal et al. designed features to measure a user's authority on a certain topic. For example, retweet impact measures how many times a user has been retweeted on that topic. The impact is dampened by a factor based on how many unique authors did the retweeting, to avoid
An initial analysis of the three aforementioned metrics showed that the users with the highest indegrees and the users with the highest retweet/mention counts were not the same. The top 1% of users by indegree showed very low correlation with the same percentile of users by retweets and by
Friendsourcing is an important component of social question answering, including how to route questions to the friends or others who will most likely answer them. The important questions include what people's behaviors are in social networks, especially what kinds of questions people ask of
Gray et al. (2013) explored how bridging social capital, question type, and relational closeness influence the perceived usefulness of, and satisfaction with, information obtained through questions asked on Facebook. Their results indicated that bridging social capital could positively predict the perceived
Sometimes, asking questions only of one's own social network or friends is not enough. If the question is obscure or time-sensitive, no members of the asker's social network may know the answer. For example, a person's friends might not have the expertise to evaluate a specific model
In order to recommend the most appropriate users to provide answers in a social network, we need approaches for detecting users' authority. In the field of information retrieval, there has been a trend of research investigating ways to detect users' authority effectively and accurately in a social network.
Nichols and Kang (2012) leveraged Twitter for question answering with targeted strangers by taking advantage of its public accessibility. In their approach, they mined public status updates posted on Twitter to find strangers with potentially useful information, and sent questions to these
Paul et al. (2011) studied question answering on Twitter and found that, of the 1,152 questions they examined, the most popular question types asked on Twitter were rhetorical (42%) and factual (16%). Surprisingly, along with entertainment (29%) and technology (29%) questions, people
Morris et al. (2010) conducted a survey of question answering within social networks with 624 people, gathering detailed data about Q&A behavior, including frequency, types of questions and answers, and motivations. They found that half (50.6%) of respondents reported having used
Social information seeking is often materialized in online question-answering (QA) websites, which are driven by a community. Such QA sites have emerged in the past few years as an enormous market, so to speak, for the fulfillment of information needs. Estimates of the volume of questions answered
These social networks support various friendsourcing behaviors, provide information benefits that traditional search tools often cannot, and may also reinforce social bonds through the process. However, there are many questions and limitations that may prevent people from asking questions on
Rzeszotarski and Morris (2014) took a novel approach to explore the perceived social costs of friendsourcing on Twitter via monetary choices. They modeled friendsourcing costs across users, and compared it with crowdsourcing on Amazon Mechanical Turk. Their findings suggested interesting design
Social Q&A or cQA, according to Shah et al., consists of three components: a mechanism for users to submit questions in natural language, a venue for users to submit answers to questions, and a community built around this exchange. Viewed in that light, online communities have performed a
Su, Q., Pavlov, D., Chow, J., & Baker, W. (2007). Internet-scale collection of human-reviewed data. In C. L. Williamson, M. E. Zurko, P. F. Patel-Schneider, & P. J. Shenoy (Eds.), Proceedings of the 16th International Conference on World Wide Web (pp. 231−240). New York:
the cases when a user has fans who retweet regardless of the content. They first used a clustering approach to find the target cluster with the highest average score across all features, and then used a ranking algorithm to find the most authoritative users within that cluster.
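The feature-then-cluster-then-rank pipeline described above can be sketched roughly as follows. This is a toy illustration, not Pal et al.'s actual formulation: the data layout, the log-based impact score, the uniqueness dampening, and the mean-split stand-in for clustering are all assumptions made for the example.

```python
import math

def retweet_impact(total_retweets, unique_retweeters):
    """Toy retweet-impact feature: raw retweet volume, dampened when
    few distinct users do the retweeting (the 'fan' effect)."""
    if total_retweets == 0:
        return 0.0
    diversity = unique_retweeters / total_retweets  # in (0, 1]
    return math.log1p(total_retweets) * diversity

def top_authorities(users, k=10):
    """users: {name: (total_retweets, unique_retweeters)} on one topic.
    Split users into a high- and a low-scoring cluster around the mean
    score, keep the higher cluster, and rank its members."""
    scores = {u: retweet_impact(*feats) for u, feats in users.items()}
    mean = sum(scores.values()) / len(scores)
    target_cluster = [u for u, s in scores.items() if s >= mean]
    return sorted(target_cluster, key=scores.get, reverse=True)[:k]
```

With this dampening, a user retweeted 100 times by only 5 distinct fans scores far below one retweeted 100 times by 90 distinct users, which is the intuition the paragraph describes.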
Existing social Q&A services can be characterized from three perspectives, following the definition of social Q&A as a service involving (1) a method for presenting information needs, (2) a place for responding to information needs, and (3) participation as a community.
are difficult to come by, but it is likely that the number of questions answered on social/community QA (cQA) sites far exceeds the number of questions answered by library reference services, which until recently were one of the few institutional sources for such
Wang, G., Gill, K., Mohanlal, M., Zheng, H., & Zhao, B. Y. (2013, May). Wisdom in the social crowd: An analysis of Quora. In Proceedings of the 22nd International Conference on World Wide Web (pp. 1341−1352). International World Wide Web Conferences Steering Committee.
cQA sites make their content – questions and associated answers submitted on the site – available on the open web, and indexable by search engines, thus enabling web users to find answers provided for previously asked questions in response to new queries.
Kim, S., Oh, J-S., & Oh, S. (2007). Best-Answer Selection Criteria in a Social Q&A site from the User Oriented Relevance Perspective. Proceedings of the 70th Annual Meeting of the American Society for Information Science and Technology (ASIST ‘07),
Harper, M. F., Raban, D. R., Rafaeli, S., & Konstan, J. K. (2008). Predictors of answer quality in online Q&A sites. In Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 865−874). New York:
Another important and unique component of a social Q&A system is that it is a community that allows members to form relationships and bonds, so that their behavior in these social Q&A services also adds to their social capital.
Cha et al. investigated possible metrics for determining authoritative users on the popular social network Twitter. They proposed three simple network-based metrics and discussed their usefulness in determining a user's influence: indegree (followers count), retweet count, and mention count.
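Given a hypothetical dataset of follow edges and tweet records, the three metrics are straightforward counts. The field names below ('retweet_of', 'mentions') are assumptions for the sketch, not a real Twitter API schema:

```python
from collections import Counter

def influence_metrics(follows, tweets):
    """follows: iterable of (follower, followee) pairs.
    tweets: iterable of dicts with optional 'retweet_of' (the user whose
    tweet was retweeted) and 'mentions' (list of mentioned users).
    Returns per-user indegree, retweet, and mention counts."""
    indegree = Counter(followee for _, followee in follows)
    retweets = Counter(t["retweet_of"] for t in tweets if t.get("retweet_of"))
    mentions = Counter(m for t in tweets for m in t.get("mentions", []))
    return indegree, retweets, mentions
```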
strangers to collect responses. As a feasibility study, they collected information on response rate and response time: 42% of users responded to questions from strangers, and 44% of the responses arrived within 30 minutes.
utility of the acquired information, meaning that information exchange on social networks is an effective way of converting social capital. Also, useful answers were more likely to be received from weak ties than from strong ties.
their social networks. For example, they may feel uncomfortable asking questions that are too private, may not want to consume too much of other people's time and effort, or may feel the burden of social debt.
Since July 2008, StackOverflow has accumulated 3.45 million questions, 1.3 million users, and over 6.86 million answers, while Quora has 437 thousand questions, 264 thousand users, and 979 thousand answers.
Pal, A., & Counts, S. (2011, February). Identifying topical authorities in microblogs. In Proceedings of the fourth ACM international conference on Web search and data mining (pp. 45-54). ACM.
Why do they spend time and effort to find information and help others online? Why are they willing to expose their personal stories to people and inform others with their experiences?
Shah et al. provide a detailed research agenda for social Q&A. A newer book by Shah presents more recent and comprehensive information on social information seeking.
of digital camera. Similarly, asking about the current wait time for security at the local airport may be impossible if none of the person's friends are currently at the airport.
is a field of research that involves studying situations, motivations, and methods for people seeking and sharing information in participatory online social sites, such as
Shah, C., Oh, J. S., & Oh, S. (2008). Exploring characteristics and effects of user participation in online social Q&A sites. First Monday, 13(9). Available from
The popularity of such sites has been increasing dramatically for the past several years. Major sites that provide a general platform for questions of all types include
Shah, C., Oh, S., & Oh, J. S. (2009). Research agenda for social Q&A. Library & Information Science Research, 31(4), 205-209. Retrieved January 2, 2011.
Cha, M., Haddadi, H., Benevenuto, F., & Gummadi, P. K. (2010). Measuring User Influence in Twitter: The Million Follower Fallacy. ICWSM, 10(10-17), 30.
mentions. This implies that follower count alone is not useful in determining whether a user's tweets get retweeted or whether other users engage with them.
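A finding like this can be reproduced on any such dataset by comparing the overlap of the top percentile of users under two metrics. The sketch below uses hypothetical scores and a simple set-overlap measure rather than the rank correlation Cha et al. actually computed:

```python
def top_fraction(scores, frac=0.01):
    """Set of users in the top `frac` of the population by score."""
    n = max(1, int(len(scores) * frac))
    return set(sorted(scores, key=scores.get, reverse=True)[:n])

def top_overlap(metric_a, metric_b, frac=0.01):
    """Fraction of metric_a's top users who are also metric_b's top users."""
    a, b = top_fraction(metric_a, frac), top_fraction(metric_b, frac)
    return len(a & b) / len(a)
```

An overlap near 0 between indegree and retweet counts would mirror the low correlation described above.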
Shah, C. (2017). Social Information Seeking: Leveraging the Wisdom of the Crowd. The Information Retrieval (IR) series. Berlin, Germany: Springer. ISBN 978-3-319-56756-3.
considerations for minimizing social cost by building a hybrid system combining friendsourcing and crowdsourcing with microtask markets.
Paul, S. A., Hong, L., & Chi, E. H. (2011, May). Is Twitter a Good Place for Asking Questions? A Characterization Study. In ICWSM.
Morris, Meredith Ringel; Teevan, Jaime; Panovich, Katrina (January 1, 2010). "What do people ask their social networks, and why?". doi:10.1145/1753326.1753587.
With these authority detection methods, social Q&A could be more effective in providing accurate answers to askers.
their social networks and how different question types affect the frequency, speed and quality of answers they receive.
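A minimal question router in this spirit might score each friend by topical match weighted by tie strength. Everything here — the topic sets, the tie-strength values, and the weighting — is a hypothetical sketch for illustration, not a published routing algorithm:

```python
def route_question(question_topics, friends, k=1):
    """friends: {name: (set_of_topics, tie_strength in [0, 1])}.
    Return the k friends most likely to give a relevant answer,
    scored by topic overlap weighted by tie strength."""
    def score(name):
        topics, tie = friends[name]
        # No topical match means a score of zero regardless of closeness.
        return len(question_topics & topics) * (0.5 + 0.5 * tie)
    return sorted(friends, key=score, reverse=True)[:k]
```

The weighting reflects the findings above: a mutual, stronger tie raises the chance of a relevant answer, but only among people whose interests match the question.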
Rzeszotarski, Jeffrey M.; Morris, Meredith Ringel (January 1, 2014). "Estimating the social costs of friendsourcing". doi:10.1145/2556288.2557181.
21st International Conference on Collaboration and Technology (CRIWG 2015), Yerevan, Armenia, 22–25 Sept., pp. 72–85.
Nichols, Jeffrey; Kang, Jeon-Hyung (January 1, 2012). "Asking questions of targeted strangers on social networks".
Gazan, R. (2008). Social annotations in digital library collections. D-Lib Magazine, 14(11/12). Available from
as well as building systems for supporting such activities. Highly related topics involve traditional and
Janes, J. (2003). The Global Census of Digital Reference. In 5th Annual VRD Conference. San Antonio, TX.
Why are the answerers willing to share information and knowledge with anonymous people, for free?
Gray, Rebecca; Ellison, Nicole B.; Vitak, Jessica; Lampe, Cliff (January 1, 2013). "Who wants to know?".
Why do they ask questions online to people whose background or expertise may be unverified?
Čižmešija, Antonela (March 2018). "Students' Information Seeking Behavior in Online Environment Using Web 2.0 Tools". INTED2018 Proceedings. IATED: 6973–6983. doi:10.21125/inted.2018.1636. ISBN 978-84-697-9480-7.
Some of the interesting and important research questions in this area include:
Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work
Why do they choose social Q&A over other sources to look for information?
http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2182/2028
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Proceedings of the 2013 conference on Computer supported cooperative work
Why do questioners choose social Q&A as a source to find information?
Liu, Z., and Jansen, B. J. (2015). Analysis of Question and Answering Behavior in Question Routing Services.
What do they expect from answers given by anonymous people on the Web?
What is the motivation of people who participate in social Q&A?
Other sites focus on particular fields; for example, StackOverflow (computing).
People associated with social information seeking include:
CSCW '13. New York, NY, USA: ACM. pp. 1213–1224. CSCW '12. New York, NY, USA: ACM. pp. 999–1002. CHI '14. New York, NY, USA: ACM. pp. 2735–2744. CHI '10. New York, NY, USA: ACM. pp. 1739–1748.
http://www.dlib.org/dlib/november08/gazan/11gazan.html
What causes people to be involved in social Q&A?
Eugene Agichtein
Rich Gazan
Jung Sun Oh
Sanghee Oh
Jeffrey Pomerantz
Chirag Shah
