December 10, 2021

Clubhouse and audio rooms are all the rage now.
This post is a step-by-step guide to building a basic Clubhouse-like app using Flutter and 100ms' live audio-video SDK.
You can also build a video conferencing app like Zoom with the 100ms Flutter SDK.
By the end of this blog, this is how your app will look and feel:

However, before proceeding, make sure that you have the following requirements:
Check out our comprehensive guide on Flutter WebRTC.
This tutorial assumes you have some prior knowledge of Flutter.
100ms is a real-time audio-video conferencing platform that lets you quickly build a fully customizable audio-video engagement experience. It is quick to integrate via native and cross-platform mobile and web SDKs.
It provides you with the following features:
If you love Android, here are some interesting reads:
Download the starter app containing all the prebuilt UI from here. Open it in your editor, build and run the app:



The file structure of the starter project looks like this:

main.dart: The entry point of the app.
user_details_input_value.dart: The screen to get user details before joining the meeting.
chat_view.dart: The chat screen to send messages to everyone in the room.
token_service.dart: A helper service class to fetch the token to join a meeting.
user.dart: A data model class for user details:

class User {
final String userName;
final String? userRole;
final String userId;
User({required this.userName, required this.userRole, required this.userId});
}
In the next step, you’ll start setting up your project and initialize 100ms in it.
You’ll need the Token endpoint and App id, so get these credentials from the Developer Section:

Before creating a room, you need to create a new app:

Next, choose the Create your Own template and click on Create Roles:

You might be wondering: why create your own roles? Since you are building a Clubhouse clone, there'll be three kinds of peers joining a room or meeting: listeners, speakers, and moderators.
A role defines whom a peer can see and hear, the quality at which they publish their video, and whether they have permission to publish video, share their screen, mute someone, or change someone's role.
Clicking on Create Roles will open a pop up with a default role:

For the clubhouse clone, you'll have 3 roles in your app: listener, speaker, and moderator.
Edit this role by clicking on the edit icon, changing the role name to listener, and unchecking the publish strategies.

Add the speaker role by clicking on Add a Role, then uncheck Can share video and Can share screen, but keep Can share audio checked in the Publish strategies.
Similarly, add the moderator role, and additionally update its Permissions by keeping Can change any participant's role and Can mute any participant checked.
After role creation, click on Set up App and your app is created:

Finally, go to Rooms in the dashboard and click on Create Room under your new custom app. Fill up the form and create the room:

Your room is created now ✨ :

N.B., Grab the Room Id to use it later to join the room.
Add the 100ms plugins in the pubspec dependencies as follows:
hmssdk_flutter: ^0.4.0
Add the 100ms Android SDK dependency in your app-level build.gradle file:
implementation 'com.github.100mslive:android-sdk:2.0.7'
Note: You might need to update your Android Kotlin version to the latest in your project-level build.gradle file:
ext.kotlin_version = '1.5.21'
Either use your IDE to install the plugins or run the following command:
flutter pub get
Update the minimum Android SDK version to 21 or later by navigating to the android/app directory and updating build.gradle:
minSdkVersion 21
You will require the Record Audio and Internet permissions in this project, as this tutorial focuses on the audio track.
A track represents either the audio or the video that a peer is publishing.
Add the permissions outside your application tag in your AndroidManifest file (android/app/src/main/AndroidManifest.xml):
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
For iOS, add the corresponding permissions to your Info.plist file:
<key>NSMicrophoneUsageDescription</key>
<string>{YourAppName} wants to use your microphone</string>
<key>NSLocalNetworkUsageDescription</key>
<string>{YourAppName} App wants to use your local network</string>
Now you are ready to join a room ✨.
You have to implement a listener class over the SDK; this will help you interact with the SDK easily. Start by adding the following file in the meeting subfolder in lib:
100ms-flutter/meeting_store.dart at main · 100mslive/100ms-flutter (github.com)
The file above contains an abstract class providing several methods you'll need to build a more advanced app. It uses meeting_controller.dart to interact with the HMS SDK.
Next, add the following code in meeting/hms_sdk_interactor.dart:
import 'package:hmssdk_flutter/hmssdk_flutter.dart';
class HMSSDKInteractor {
late HMSConfig config;
late List<HMSMessage> messages;
late HMSMeeting _meeting;
HMSSDKInteractor() {
_meeting = HMSMeeting();
}
Future<void> joinMeeting(
{required HMSConfig config,
required bool isProdLink,
required bool setWebRtcLogs}) async {
this.config = config;
await _meeting.joinMeeting(
config: this.config,
isProdLink: isProdLink,
// endPoint: Constant.getTokenURL,
setWebrtcLogs: setWebRtcLogs);
}
Future<void> leaveMeeting() async {
return await _meeting.leaveMeeting();
}
Future<void> switchAudio({bool isOn = false}) async {
return await _meeting.switchAudio(isOn: isOn);
}
Future<void> sendMessage(String message) async {
return await _meeting.sendMessage(message);
}
void addMeetingListener(HMSUpdateListener listener) {
_meeting.addMeetingListener(listener);
}
void removeMeetingListener(HMSUpdateListener listener) {
_meeting.removeMeetingListener(listener);
}
Future<bool> endRoom(bool lock) async {
bool ended = await _meeting.endRoom(lock);
return ended;
}
void removePeer(String peerId) {
_meeting.removePeer(peerId);
}
void changeRole(
{required String peerId,
required String roleName,
bool forceChange = true}) {
_meeting.changeRole(
peerId: peerId, roleName: roleName, forceChange: forceChange);
}
Future<List<HMSRole>> getRoles() async {
return _meeting.getRoles();
}
Future<bool> isAudioMute(HMSPeer? peer) async {
bool isMute = await _meeting.isAudioMute(peer);
return isMute;
}
void muteAll() {
_meeting.muteAll();
}
void unMuteAll() {
_meeting.unMuteAll();
}
}
The above class provides a number of wrapper methods over the HMS SDK that you'll use later.
Next, create a meeting_controller.dart file inside the meeting subfolder to interact with the HMSSDKInteractor:
import 'package:clubhouse_clone/meeting/hms_sdk_interactor.dart';
import 'package:clubhouse_clone/models/user.dart';
import 'package:clubhouse_clone/services/token_service.dart';
import 'package:hmssdk_flutter/hmssdk_flutter.dart';
class MeetingController {
final String roomUrl;
final User user;
final HMSSDKInteractor? _hmsSdkInteractor;
MeetingController(
{required this.roomUrl, required this.user})
: _hmsSdkInteractor = HMSSDKInteractor();
Future<bool> joinMeeting() async {
String? token = await TokenService.getToken(
userId: user.userName, roomId: roomUrl, role: user.userRole);
if (token == null) return false;
HMSConfig config = HMSConfig(
userId: user.userId,
roomId: roomUrl,
authToken: token,
userName: user.userName,
);
await _hmsSdkInteractor?.joinMeeting(
config: config, isProdLink: true, setWebRtcLogs: true);
return true;
}
void leaveMeeting() {
_hmsSdkInteractor?.leaveMeeting();
}
Future<void> switchAudio({bool isOn = false}) async {
return await _hmsSdkInteractor?.switchAudio(isOn: isOn);
}
Future<void> sendMessage(String message) async {
return await _hmsSdkInteractor?.sendMessage(message);
}
void addMeetingListener(HMSUpdateListener listener) {
_hmsSdkInteractor?.addMeetingListener(listener);
}
void removeMeetingListener(HMSUpdateListener listener) {
_hmsSdkInteractor?.removeMeetingListener(listener);
}
void changeRole(
{required String peerId,
required String roleName,
bool forceChange = false}) {
_hmsSdkInteractor?.changeRole(
peerId: peerId, roleName: roleName, forceChange: forceChange);
}
Future<List<HMSRole>> getRoles() async {
return _hmsSdkInteractor!.getRoles();
}
Future<bool> isAudioMute(HMSPeer? peer) async {
bool isMute = await _hmsSdkInteractor!.isAudioMute(peer);
return isMute;
}
Future<bool> endRoom(bool lock) async {
return (await _hmsSdkInteractor?.endRoom(lock))!;
}
void removePeer(String peerId) {
_hmsSdkInteractor?.removePeer(peerId);
}
void unMuteAll() {
_hmsSdkInteractor?.unMuteAll();
}
void muteAll() {
_hmsSdkInteractor?.muteAll();
}
}
N.B., Make sure to generate the MobX store's generated code (for meeting_store.dart) using build_runner and mobx_codegen.
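Assuming meeting_store.dart is a MobX store (the Observer widgets used later in this post rely on that), the usual command to run the generator is:
flutter pub run build_runner build --delete-conflicting-outputs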
A room is the basic object that 100ms SDKs return on a successful connection. It contains references to peers, tracks, and everything you need to render a live a/v app. To join a room, you require an HMSConfig object with the following fields: userId, roomId, authToken, and userName.
First, you can cover the userName and userId fields by using the UserDetailsInputView widget, which collects the username and role through the usernameTextEditingController and userRoleTextEditingController TextEditingControllers:

You can then save this info in your User object in the onPressed event:
User user = User(
userName: usernameTextEditingController.text,
userRole: userRoleTextEditingController.text,
userId: const Uuid().v1());
N.B., You need to add the uuid package to generate the userId.
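If it isn't already in your starter project, add the dependency to pubspec.yaml (the version below is just an example; check pub.dev for the latest):
uuid: ^3.0.0
Then import it in the file where you build the User object:
import 'package:uuid/uuid.dart';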
Now, you need the roomId to join a room, which you grabbed earlier.
Finally, generate an authToken by making a POST request (using the http package) to the token endpoint that was generated when you created your app on the 100ms dashboard:
import 'dart:convert';
import 'package:http/http.dart' as http;
class TokenService {
// Replace this with the token endpoint from your own 100ms dashboard's Developer section.
static const tokenURL = "https://prod-in.100ms.live/hmsapi/himanshu.app.100ms.live/api/token";
static const defaultRole = "listener";
static Future<String?> getToken({required String userId, required String roomId, String? role}) async {
http.Response response = await http.post(Uri.parse(tokenURL),
body: {'room_id': roomId, 'user_id': userId, 'role': role ?? defaultRole});
var body = json.decode(response.body);
return body['token'];
}
}
Additionally, the request body requires the room_id, user_id, and role fields to generate the token. At this point in this tutorial, these fields are already available to you.
You can join a meeting now as you have all the required fields for the config object. So go to your UserDetailsInputView and update the Join Room button’s onPressed event like below:
User user = User(
userName: usernameTextEditingController.text,
userRole: userRoleTextEditingController.text,
userId: const Uuid().v1());
// Replace with the Room Id you grabbed from the dashboard earlier.
const roomId = '618d48acbe6c3c0b351510e0';
Navigator.push(
context,
MaterialPageRoute(
builder: (_) => Provider<MeetingStore>(
create: (_) => MeetingStore(),
child: RoomView(
roomTitle: 'Room Title',
roomId: roomId,
user: user),
)));
This will navigate you to a new screen where you are passing your user details, room id, and room title.
Next, in your RoomView widget, add the following code in your _RoomViewState:
class _RoomViewState extends State<RoomView> {
// 1
late MeetingStore _meetingStore;
@override
void initState() {
super.initState();
// 2
_meetingStore = context.read<MeetingStore>();
// 3
MeetingController meetingController = MeetingController(
roomUrl: widget.roomId, user: widget.user);
_meetingStore.meetingController = meetingController;
// 4
initMeeting();
}
// 5
initMeeting() async {
bool ans = await _meetingStore.joinMeeting();
if (!ans) {
Navigator.of(context).pop();
}
_meetingStore.startListen();
// 6
if (widget.user.userRole == 'listener') {
if (_meetingStore.isMicOn) {
_meetingStore.toggleAudio();
}
}
}
...
}
In the above code:
1. Declared a MeetingStore field for this screen.
2. Read the MeetingStore instance provided through Provider in the previous step.
3. Created a MeetingController with the room id and user details and attached it to the store.
4. Called initMeeting from initState.
5. initMeeting: the method in which you use the _meetingStore object to join the meeting. If joined, you start listening to changes in the meeting; otherwise you pop back.
6. If the user joined as a listener, their mic is toggled off right after joining.

Build and run your app. Now, you have joined the meeting/room ✨

This will trigger the onJoin event, and your app will receive an update from the 100ms SDK.
✅ On success, the onJoin(room: HMSRoom) method of HMSUpdateListener will be invoked with information about the room, encapsulated in the HMSRoom object.
❌ On failure, the onError(error: HMSException) method will be invoked with the exact failure reason.
A peer is an object returned by 100ms SDKs that contains all information about a user - name, role, track, etc. And now, you need to display your peers.
So, update the build method of your RoomView by wrapping the peer list in an Observer so it rebuilds on any changes, like below:
Expanded(
// child: const Text('Waiting for other to join!')
child: Observer(builder: (context) {
// 1
if (!_meetingStore.isMeetingStarted) return const SizedBox();
// 2
if (_meetingStore.peers.isEmpty) {
return const Center(
child: Text('Waiting for other to join!'));
}
// 3
final filteredList = _meetingStore.peers;
return GridView.builder(
itemCount: filteredList.length,
itemBuilder: (context, index) {
return GestureDetector(
onLongPress: () {
},
child: Padding(
padding: const EdgeInsets.all(4.0),
child: CircleAvatar(
radius: 25,
child: Column(
mainAxisAlignment: MainAxisAlignment.center,
children: [
// 4
Text(
filteredList[index].name,
style: const TextStyle(fontSize: 22),
),
],
),
),
),
);
},
gridDelegate:
const SliverGridDelegateWithFixedCrossAxisCount(
crossAxisCount: 3));
}),
)
In the above code, you did the following:
1. Returned an empty SizedBox until the meeting has started.
2. Showed a waiting message while there are no other peers in the room.
3. Rendered the peers in a GridView with three avatars per row.
4. Displayed each peer's name inside a CircleAvatar.
Build and run your app to see the peers:


To display each peer's role, add another Text widget below the peer name as follows:
...
Text(
filteredList[index].name,
style: const TextStyle(fontSize: 22),
),
Text(
filteredList[index].role!.name,
style: const TextStyle(fontSize: 14),
)
...
Here you displayed the role name using the peer object.

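You can go a step further and also surface each peer's audio status. A minimal sketch, assuming your MeetingStore forwards the isAudioMute helper from the MeetingController shown earlier, is to add a small mic icon inside the same Column, wrapped in a FutureBuilder:
FutureBuilder<bool>(
  // isAudioMute ultimately asks the HMS SDK whether this peer's audio is muted.
  future: _meetingStore.isAudioMute(filteredList[index]),
  builder: (context, snapshot) {
    final bool isMuted = snapshot.data ?? false;
    return Icon(isMuted ? Icons.mic_off : Icons.mic, size: 14);
  },
)
This re-queries the mute state on every rebuild, which is fine for a sketch; a production app would track mute state reactively in the store instead.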
To mute or unmute your mic, update your mic button as follows:
// 1
Observer(builder: (context) {
return OutlinedButton(
style: OutlinedButton.styleFrom(
backgroundColor: Colors.grey.shade300,
padding: EdgeInsets.zero,
shape: const CircleBorder()),
onPressed: () {
// 2
_meetingStore.toggleAudio();
},
// 3
child: Icon(
_meetingStore.isMicOn ? Icons.mic : Icons.mic_off));
})
Here you updated the button as follows:
1. Wrapped it in an Observer so it rebuilds whenever the mic state changes.
2. Used the onPressed event to toggle the mic via _meetingStore.toggleAudio().
3. Used the _meetingStore.isMicOn boolean to check the mic state and update the IconData accordingly.

To add the ability to chat with everyone in the current meeting, you need to update your ChatView widget.
First, accept the MeetingStore object in your ChatView constructor from the RoomView as below:
final MeetingStore meetingStore;
const ChatView({required this.meetingStore, Key? key}) : super(key: key);
Next, store this object inside your _ChatViewState as below:
late MeetingStore _meetingStore;
@override
void initState() {
super.initState();
_meetingStore = widget.meetingStore;
}
Next, update the body to render the messages as below:
Expanded(
// child: const Text('No messages')
// 1
child: Observer(
builder: (_) {
// 2
if (!_meetingStore.isMeetingStarted) {
return const SizedBox();
}
// 3
if (_meetingStore.messages.isEmpty) {
return const Text('No messages');
}
// 4
return ListView(
children: List.generate(
_meetingStore.messages.length,
(index) => Container(
padding: const EdgeInsets.all(5.0),
child: Column(
crossAxisAlignment: CrossAxisAlignment.start,
mainAxisSize: MainAxisSize.min,
children: [
Row(
children: [
Expanded(
child: Text(
// 5
_meetingStore
.messages[index].sender?.name ??
"",
style: const TextStyle(
fontSize: 10.0,
color: Colors.black,
fontWeight: FontWeight.w900),
),
),
Text(
// 6
_meetingStore.messages[index].time
.toString(),
style: const TextStyle(
fontSize: 10.0,
color: Colors.black,
fontWeight: FontWeight.w900),
)
],
),
const SizedBox(
height: 10.0,
),
Text(
// 7
_meetingStore.messages[index].message
.toString(),
style: const TextStyle(
fontSize: 16.0,
color: Colors.black,
fontWeight: FontWeight.w300),
),
],
),
decoration: const BoxDecoration(
border: Border(
left: BorderSide(
color: Colors.blue,
width: 5,
),
)),
),
),
);
},
),
)
In the above code:
1. Wrapped the message list in an Observer so it rebuilds when new messages arrive.
2. Returned an empty SizedBox until the meeting has started.
3. Showed a 'No messages' placeholder while the message list is empty.
4. Rendered each message in a ListView.
5. Displayed the sender's name.
6. Displayed the time the message was sent.
7. Displayed the message text itself.

After this, you are ready to render incoming messages. However, nothing will be displayed yet, because peers aren't allowed to send a message yet.

So update the onTap event of the Send icon button of the ChatView as below:
// 1
String message = messageTextController.text;
if (message.isEmpty) return;
// 2
DateTime currentTime = DateTime.now();
final DateFormat formatter =
DateFormat('yyyy-MM-dd hh:mm a');
if (valueChoose == "Everyone") {
// 3
_meetingStore.sendMessage(message);
// 4
_meetingStore.addMessage(HMSMessage(
sender: _meetingStore.localPeer!,
message: message,
type: "chat",
time: formatter.format(currentTime),
hmsMessageRecipient: HMSMessageRecipient(
recipientPeer: null,
recipientRoles: null,
hmsMessageRecipientType:
HMSMessageRecipientType.BROADCAST),
));
}
messageTextController.clear();
Here you did the following:
1. Read the message text from the messageTextController TextEditingController and returned early if it was empty.
2. Formatted the current time to attach to the message.
3. Used sendMessage of the _meetingStore object to send the broadcast message.
4. Added the message locally to the chat as an HMSMessage object.

Build and run your app. Now, you can interact with everyone in the meeting ✨

If you want to remove some peers from the meeting, you can just use the removePeer method.
Go back to the RoomView widget and update your onLongPress event as below:
onLongPress: () {
_meetingStore.removePeer(filteredList[index]);
}
N.B., Not every peer can remove others from a meeting; that's why you created the custom roles at the beginning of this tutorial.
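A small sketch of how you might gate the gesture on the local user's role, assuming the User object you passed into RoomView is available as widget.user and only the moderator role should have this power:
onLongPress: () {
  // Only moderators (as configured on the dashboard) should be able to remove a peer.
  if (widget.user.userRole == 'moderator') {
    _meetingStore.removePeer(filteredList[index]);
  }
},
Even without this UI check, the request should fail for roles that lack the permission on the 100ms dashboard.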

To leave the room, go back to RoomView and update the onPressed method of Leave Quietly button as below:
onPressed: () {
_meetingStore.meetingController.leaveMeeting();
Navigator.pop(context);
}
Here, you use the meetingController exposed by the MeetingStore object to leave the room.
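If the local peer is a moderator, you could instead end the room for everyone using the endRoom helper defined earlier in the MeetingController. A rough sketch, assuming widget.user is still available in this widget:
onPressed: () async {
  if (widget.user.userRole == 'moderator') {
    // End the room for everyone; pass true instead of false to also lock it so nobody can rejoin.
    await _meetingStore.meetingController.endRoom(false);
  } else {
    _meetingStore.meetingController.leaveMeeting();
  }
  Navigator.pop(context);
}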

Finally, you have covered the basic functionality and are ready to use these skills in your own projects.
You can find the starter and final projects here. In this tutorial, you learned about 100ms and how you can easily use it to build an audio room application. However, this is only the beginning; you can learn more about changing roles, changing tracks, adding peers, direct chat with another peer, and much more from here.
We hope you enjoyed this tutorial. Feel free to reach out to us if you have any queries. Thank you!