Exploring Hive Communities with HiveSQL
One of the powerful features of Hive is its Communities. Today I decided to explore Hive Communities using HiveSQL. Nothing too complicated. To start, I wanted to know how many communities have been created so far and how many subscribers each has, and I would like to share how to do that with HiveSQL. This can serve as a base for more interesting queries later, such as how active each community is, how many posts and comments are made every day, and how posts and comments are being rewarded.
There are three Communities-related tables in HiveSQL: Communities, CommunitiesSubscribers, and CommunitiesRoles. Today I will be using the first two. CommunitiesSubscribers holds two pieces of information: community, the name of the community, and subscriber, the name of the subscriber. The Communities table has more columns: about, description, flag_text, language, name, nsfw, title, TS, and type. Most of them are self-explanatory, but name and title can cause confusion.
By Hive blockchain convention, communities are given names like 'hive-1234567' when they are created. This is stored in the name column of the Communities table, and also in the community column of the CommunitiesSubscribers table. The title column of the Communities table stores the human-readable name given by the creator of the community, for example OCD, GEMS, LeoFinance, or Curie.
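Since the name column in Communities matches the community column in CommunitiesSubscribers, the 'hive-1234567' identifier is the join key between the two tables. Here is a minimal Python sketch of that relationship, using made-up sample rows (the real data lives in HiveSQL):

```python
# Made-up sample rows illustrating the two HiveSQL tables.
communities = [
    {"name": "hive-100001", "title": "Example Photography"},
    {"name": "hive-100002", "title": "Example Finance"},
]
subscribers = [
    {"community": "hive-100001", "subscriber": "alice"},
    {"community": "hive-100001", "subscriber": "bob"},
    {"community": "hive-100002", "subscriber": "carol"},
]

# Communities.name matches CommunitiesSubscribers.community,
# so joining on those columns lets us count subscribers per title.
counts = {}
for sub in subscribers:
    counts[sub["community"]] = counts.get(sub["community"], 0) + 1

for com in communities:
    print(com["title"], counts.get(com["name"], 0))
```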
Using the following query, we can get the total number of communities that have subscribers:
SELECT COUNT(DISTINCT community)
FROM CommunitiesSubscribers
The result will be 2150 communities.
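COUNT(DISTINCT community) counts each community once, no matter how many subscriber rows it has. The same idea in plain Python, again with made-up sample rows:

```python
# Made-up (community, subscriber) rows standing in for
# the CommunitiesSubscribers table.
rows = [
    ("hive-100001", "alice"),
    ("hive-100001", "bob"),
    ("hive-100002", "carol"),
]

# A set keeps each community name once, mirroring COUNT(DISTINCT community).
distinct_communities = len({community for community, _ in rows})
print(distinct_communities)  # → 2
```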
To get the list of communities with the number of subscribers, we will need to join the two tables, Communities and CommunitiesSubscribers. As usual, I will be running the HiveSQL queries within a Python script. This makes it easy to output the result as a markdown table. Below is the list of the top 500 Hive communities ordered by the number of subscribers.
After the list, at the end of the post, you can also see and review the Python code. Feel free to use it in your own scripts to explore HiveSQL, and let me know if you have questions or suggestions.
import pyodbc

# Connect to the HiveSQL server (replace the credentials with your own).
connection = pyodbc.connect('Driver={ODBC Driver 17 for SQL Server};'
                            'Server=vip.hivesql.io;'
                            'Database=DBHive;'
                            'uid=Hive-geekgirl;'
                            'pwd=XXXXXXXXXXXXXXX')
cursor = connection.cursor()

# Join the two tables on the community identifier and
# count subscribers per community.
SQLCommand = '''
SELECT name, title, COUNT(subscriber)
FROM Communities
JOIN CommunitiesSubscribers
ON Communities.name = CommunitiesSubscribers.community
GROUP BY name, title
ORDER BY COUNT(subscriber) DESC
'''
result = cursor.execute(SQLCommand)
result = result.fetchmany(500)
connection.close()

# Build a markdown table: rank, community title linked to its
# trending page, and subscriber count.
text = '| # | Community | Subscribers | \n| ----------- | ----------- | ----------- | \n'
count = 1
for community in result:
    text_line = '|' + str(count) + '| [' + community[1]
    text_line += '](https://hive.blog/trending/' + community[0]
    text_line += ') |' + str(community[2]) + ' | \n'
    text = text + text_line
    count += 1
print(text)
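The row-formatting logic in the script can also be factored into a small helper, which makes the markdown layout easier to see. A sketch with a made-up sample row:

```python
def markdown_row(index, name, title, subscribers):
    # Each community title links to its trending page on hive.blog.
    return '| {} | [{}](https://hive.blog/trending/{}) | {} |'.format(
        index, title, name, subscribers)

header = '| # | Community | Subscribers |\n| --- | --- | --- |'
rows = [('hive-100001', 'Example Photography', 2)]  # made-up sample row
lines = [markdown_row(i + 1, name, title, subs)
         for i, (name, title, subs) in enumerate(rows)]
print(header + '\n' + '\n'.join(lines))
```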
Posted Using LeoFinance Beta
https://twitter.com/geekjen/status/1353950136351694849
I'm pretty sure that dtube used account's keys to self-subscribe.
Kinda lame that there's two photography communities so close to one another.
I enjoy both communities and find that they trend different things.
The code is very well written. I don't think we can make it more efficient in any way.
I have downloaded the 2nd layer transaction details from Jan 1 to Jan 25 and stored them in 'Shelve'. Let me know if you want to make use of it.
!WINE
Cheers, @amr008 You Successfully Shared 0.100 WINE With @geekgirl.
You Earned 0.100 WINE As Curation Reward.
You Utilized 1/3 Successful Calls.
WINE Current Market Price : 1.100 HIVE
GEMS, no surprise, the shitpost community lol
Pretty surprised about the lifestyle and gaming ones, I didn't know they were that popular. You can see how content is concentrated on rewards rather than categories, though.
Everyone knows that the best Hive Community is the Hive Near Guild.
https://peakd.com/c/hive-183607/created
Not many subscribers. Seems to be an awesome community.
Wowwww, that is very interesting and helpful! Thank you!
Great Post. I'll be using HiveSQL. I'm glad to see Silver Gold Stackers in the list.
This is great