Channel: Photon Voice — Photon Engine
Viewing all 671 articles

Magic Leap Voice: DemoMagicLeap scene must connect twice to get voice in both directions

Hi, we are integrating PUN Voice for Magic Leap. After some fiddling we have things working (yay!). Note that forcing a custom room name in PhotonStatusDisplay.cs (hard-coding it, due to the demo implementation) helps ensure a match.

An observation for those who are exploring: using Magic Leap Remote with Unity to test voice (with an actual ML1) does not work. You have to deploy the app natively to the headset to get a working voice connection. Q: Is this currently a known limitation, or something related to our Unity setup?

When connecting two devices to a room, upon the second client's first connection the master hears the client's audio, but the client does not hear the master's until one of the two leaves and re-joins the room, at which point audio is bi-directional. Q: Is this a known bug, or do you have any recommendations for making it work on the initial connection? Could this be a firewall or port-forwarding issue? Thanks for your advice.

PUN Voice DemoMagicLeap Scene feature request

Hello! This is intended for whoever authored the Magic Leap integration SDK (beta) scene, but perhaps anyone who knows the Voice SDK and Unity can answer. We have implemented the DemoMagicLeap scene and it works well, and we have included other GameObjects in the shared experience. However, making the controller shared has proven difficult: the controller visualizer will not share the way the VoiceAvatar does (the head with the speaker attached). I think this is because the controller is a special type, and/or the controller connection handler scripts effectively override the PUN visualizer scripts.

The goal is to have everyone in the shared room see each other's controllers the same way they see the VoiceAvatar (with a laser pointer activated from the controller, which we do now). Presently we are sharing a laser projected from the player's VoiceAvatar's head; the controller itself will not share across the scene properly.

To whoever authored the demo scene: could you please consider extending it to demonstrate how to share the controller specifically, or provide a hint on how to achieve that? Can you confirm my suspicion that it does not work because the controller is a special type, and that instead we have to write a custom script that takes only the controller's transform/position and passes it to a GameObject prefab that looks like a controller? Thanks very much in advance for the advice.
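The workaround the poster suspects (syncing only the controller transform to a plain networked proxy prefab) can be sketched roughly as follows, using the standard PUN 2 `IPunObservable` API. The `controllerTransform` reference and the component name are hypothetical; how you obtain the real Magic Leap controller transform depends on your connection handler.

```csharp
using Photon.Pun;
using UnityEngine;

// Sketch: attach to a plain "controller proxy" prefab instantiated with
// PhotonNetwork.Instantiate. The local owner copies the real controller
// transform into this object every frame; PUN replicates it to everyone.
public class ControllerProxy : MonoBehaviourPun, IPunObservable
{
    // Hypothetical: assigned by the local player from the Magic Leap
    // controller's tracked transform.
    public Transform controllerTransform;

    void Update()
    {
        if (photonView.IsMine && controllerTransform != null)
        {
            transform.SetPositionAndRotation(
                controllerTransform.position,
                controllerTransform.rotation);
        }
    }

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            stream.SendNext(transform.position);
            stream.SendNext(transform.rotation);
        }
        else
        {
            transform.position = (Vector3)stream.ReceiveNext();
            transform.rotation = (Quaternion)stream.ReceiveNext();
        }
    }
}
```

Alternatively, a stock PhotonTransformView on the proxy prefab achieves the same replication without custom serialization code.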

Problems with Photon Voice 2

Hi! I'm introducing Photon Voice 2 into my Unity prototype. PUN 2 works fine, but players can't hear each other over voice. I added a PhotonVoiceNetwork to the scene, added components to my player prefab (which is instantiated for every player), and added the Voice App ID to PhotonServerSettings. What am I missing, or doing wrong?
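For reference, a minimal Photon Voice 2 wiring typically looks like the sketch below. The component types are from the Voice 2 Unity package; treating `PrimaryRecorder` as the hookup point is an assumption about the usual setup, so check it against your package version.

```csharp
using Photon.Voice.PUN;
using Photon.Voice.Unity;
using UnityEngine;

// Sketch of the components Photon Voice 2 expects.
// Scene side: one PhotonVoiceNetwork plus a Recorder.
// Player prefab side: PhotonVoiceView + Speaker next to the PhotonView.
public static class VoiceSetupSketch
{
    public static void WireScene(GameObject sceneObject)
    {
        var voiceNetwork = sceneObject.AddComponent<PhotonVoiceNetwork>();
        var recorder = sceneObject.AddComponent<Recorder>();
        recorder.TransmitEnabled = true;
        voiceNetwork.PrimaryRecorder = recorder; // used by PhotonVoiceViews by default
    }

    public static void WirePlayerPrefab(GameObject playerPrefab)
    {
        playerPrefab.AddComponent<PhotonVoiceView>(); // links voice to the PhotonView
        playerPrefab.AddComponent<Speaker>();         // plays remote players' audio
    }
}
```

In practice these components are usually added in the Inspector rather than in code; the sketch just shows which component goes where.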

Photon Voice, WebGL, Multi-threaded Web Assembly and Unity 2019.1

I understand that there is no support for Photon Voice on Unity WebGL, and that this is due to the multithreading support Photon Voice requires. Now WebAssembly supports multithreading: as I read in the Unity 2019.1 release notes, they are providing experimental multithreaded WebAssembly. Is it possible to have Photon Voice on WebGL now? I understand that this cannot be in released-product form, but any results from experiments would be highly appreciated.

Oculus Avatar lip sync

Oculus Avatar lip sync is not working while voice chat is connected. I am using the Photon Voice 2 plugin.

PUN + Voice + Oculus Quest + Unity

Hey, so I got multiplayer working on the Quest and decided to add voice. Voice transmitted Quest -> Quest is not loud enough, even at maximum volume. That is not an issue when I test Editor -> Quest: sound transmitted by the Editor is heard clearly on the Quest. The same is not true for voice captured by the Quest. Is anyone else experiencing this?

Lip Sync

Why is OVR lip sync not working while connected to Photon Voice?

How to sync voice with OnPhotonSerializeView data?

I'm recording a user's mouth movements and sending them via OnPhotonSerializeView, and sending audio via Photon Voice. How can I sync the two up? For the data, I could do something with PhotonNetwork.Time, but how would I sync that with the audio? Does the audio have a timestamp?
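One hedged approach along the lines the poster suggests: stamp the mouth data with `PhotonNetwork.Time` on the sender, buffer it on the receiver, and apply each sample once it is a fixed delay old, with the delay tuned by ear to match the voice playback latency. The delay constant and buffering scheme below are assumptions, not Photon API.

```csharp
using System.Collections.Generic;
using Photon.Pun;
using UnityEngine;

// Sketch: timestamped mouth data replayed behind a fixed delay so it
// roughly lines up with the (independently delayed) voice stream.
public class MouthSync : MonoBehaviourPun, IPunObservable
{
    const double playbackDelay = 0.2; // seconds; assumed, tune to match voice latency

    float localMouthOpen; // set by your capture code on the owner
    readonly Queue<(double time, float value)> buffer = new Queue<(double, float)>();

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            stream.SendNext(PhotonNetwork.Time); // shared server clock
            stream.SendNext(localMouthOpen);
        }
        else
        {
            buffer.Enqueue(((double)stream.ReceiveNext(), (float)stream.ReceiveNext()));
        }
    }

    void Update()
    {
        // Consume buffered samples whose stamped time is old enough.
        while (buffer.Count > 0 && PhotonNetwork.Time - buffer.Peek().time >= playbackDelay)
        {
            ApplyMouthOpen(buffer.Dequeue().value);
        }
    }

    void ApplyMouthOpen(float value) { /* drive the blendshape here */ }
}
```

This gets approximate lip sync without touching the audio pipeline; exact sync would require correlating with the voice frames' own timing, which this sketch does not attempt.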

Photon Voice not recording

Hello, I created a scene using Photon where each player is represented by a "proxy" prefab that I instantiate; the local player is linked to his own proxy prefab. I would like to add voice communication to my scene: I added a Recorder and a PhotonVoiceNetwork component to the local player, and a PhotonVoiceView component plus a Speaker component to the proxy prefab. However, I can't manage to record any sound (Transmit and IsRecording are enabled), even though the demo scenes work fine. Could you help me find out what I am missing? Thank you for your answers.

Photon Voice Mic Volume

Hi all, how do I increase the mic transmit volume with Photon Voice? Voice is working, but the volume is weak. Thanks!

Photon Voice Chat Volume

Hi, my chat volume is very low. I tried a suggestion from an old post: at the start of the OnAudioFrame method in "PhotonVoiceNetwork.cs" I added these lines:
    for (int i = 0; i < frame.Length; i++)
    {
        frame[i] *= (AppSettings.VoiceChatVolume * 5);
    }
Only AppSettings gives an error (red-lined); the rest of it seems OK, and .VoiceChatVolume is not red-lined on its own. AppSettings isn't found anywhere in the project search. Any ideas? Thanks, Steve
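A hedged alternative that avoids patching Photon files: Photon Voice 2 allows inserting a custom audio processor on the local voice. The sketch below is an assumption-heavy adaptation of the Voice 2 processor pattern (`VoiceComponent`, `PhotonVoiceCreated`, `IProcessor<float>`); verify the exact type names against your package version, and note that `AppSettings.VoiceChatVolume` from the quoted snippet is replaced by a local `Gain` field, since no such Photon type exists by default.

```csharp
using Photon.Voice;
using Photon.Voice.Unity;
using UnityEngine;

// Sketch: boost outgoing mic samples by a gain factor, with clamping.
// Attach next to the Recorder; Voice 2 invokes PhotonVoiceCreated when
// the local voice is set up.
public class MicGain : VoiceComponent
{
    [Range(1f, 10f)] public float Gain = 3f; // assumed starting point; tune by ear

    class GainProcessor : IProcessor<float>
    {
        readonly MicGain owner;
        public GainProcessor(MicGain owner) { this.owner = owner; }

        public float[] Process(float[] buf)
        {
            for (int i = 0; i < buf.Length; i++)
            {
                // Clamp to [-1, 1] to avoid hard clipping artifacts.
                buf[i] = Mathf.Clamp(buf[i] * owner.Gain, -1f, 1f);
            }
            return buf;
        }

        public void Dispose() { }
    }

    void PhotonVoiceCreated(PhotonVoiceCreatedParams p)
    {
        if (p.Voice is LocalVoiceAudioFloat floatVoice)
        {
            floatVoice.AddPostProcessor(new GainProcessor(this));
        }
    }
}
```

Keeping the boost in a separate component survives Photon package upgrades, unlike edits inside PhotonVoiceNetwork.cs.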

Why PUN does not support communications between photon rooms?

I think this is a very common feature: for example, two people are friends but are not currently in the same Photon room, and I want to invite one to join my room. I need to send him a message telling him which Photon room I am in, so he can click a button and join me automatically. Without this feature, we must implement the client and server logic ourselves. I would like to know for what reasons this feature is not implemented.
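As a hedged sketch of the usual workaround: carry the invitation over a side channel such as Photon Chat, and let the recipient join the named room with PUN. The `"JOIN:"` message format below is an assumption, not a Photon convention; both clients are assumed to already be connected to Photon Chat and PUN.

```csharp
using Photon.Chat;
using Photon.Pun;

// Sketch: the inviter sends the current room name to a friend over
// Photon Chat; the friend joins that room when a button is clicked.
public static class RoomInvites
{
    // Inviter side: private message carrying the room name.
    public static void SendInvite(ChatClient chat, string friendUserId)
    {
        chat.SendPrivateMessage(friendUserId, "JOIN:" + PhotonNetwork.CurrentRoom.Name);
    }

    // Recipient side: call with the received message when the user accepts.
    public static void AcceptInvite(string message)
    {
        if (message.StartsWith("JOIN:"))
        {
            PhotonNetwork.JoinRoom(message.Substring("JOIN:".Length));
        }
    }
}
```

Any other out-of-room channel (your own backend, a platform invite API) works the same way; the key point is that room-to-room messaging has to ride outside the room itself.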

Do the Photon client and server filter voice streams when the audio listener is out of range?

Case A: only person A and person B are in a Photon room, the max distance of their 3D sound settings is 500, and the real distance between them is 600, so they cannot hear each other. Are the voice streams still transmitted to the Photon server or not?

Case B: 100 people are in a Photon room, all with a 3D sound max distance of 500. Within range of person A there is only one other person, B, so A can only hear B. The Photon server may receive 100 voice streams, but does person A receive all of them, or only the stream from person B?
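For context: Unity's 3D rolloff is applied purely on the receiving client, so by default every stream is transmitted regardless of distance. Server-side filtering is usually done with interest groups. The sketch below maps coarse world cells to Voice interest groups; the grid mapping is a toy assumption, and the `InterestGroup` / `OpChangeGroups` usage should be verified against your Photon Voice 2 and Realtime versions.

```csharp
using Photon.Voice.PUN;
using Photon.Voice.Unity;
using UnityEngine;

// Sketch: partition the world into cells and map each cell to a Photon
// interest group, so the server only relays audio between nearby
// players. Group 0 means broadcast-to-all, so cells start at group 1.
public class VoiceProximityGroups : MonoBehaviour
{
    public Recorder recorder;
    public float cellSize = 500f; // roughly match the audible range

    byte CellGroup(Vector3 pos)
    {
        // Assumed toy mapping: 16x16 grid of cells -> groups 1..255.
        int x = Mathf.FloorToInt(pos.x / cellSize) & 0xF;
        int z = Mathf.FloorToInt(pos.z / cellSize) & 0xF;
        return (byte)(1 + (x * 16 + z) % 255);
    }

    void Update()
    {
        byte group = CellGroup(transform.position);
        if (recorder.InterestGroup != group)
        {
            recorder.InterestGroup = group; // transmit into our cell's group
            // Empty remove-array means "remove all" by Photon convention,
            // so afterwards we listen only to our own cell.
            PhotonVoiceNetwork.Instance.Client.OpChangeGroups(
                new byte[0], new byte[] { group });
        }
    }
}
```

A real implementation would also subscribe to neighboring cells so players near a cell boundary still hear each other; this sketch only shows the mechanism.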

Echo in voice chat Oculus Go

Hi everyone, I am developing a Unity app for the Oculus Go, and when two players talk to each other they can hear their own voice. The problem goes away when headphones are used. How do I get rid of this echo? Thanks in advance, Rajat

The type or namespace name 'IVoiceFrontend' could not be found (are you missing a using directive or an assembly reference?)

When importing Photon Voice v2.8, the above error appears.

Audio Processing DSP - Does it remove Echoes? Cannot seem to get it to work...

I have an issue very similar to this recent forum post: https://forum.photonengine.com/discussion/13533/how-to-estimate-reversestreamdelayms-in-webrtcaudiodsp-of-photon-voice-2#latest

I get a lot of echo when using the speakerphone on mobile devices with Photon Voice. If I plug headphones into the phones there is no echo and you hear the other party's voice clearly, but on speakerphone it becomes unusable when more than one person is speaking, because each device records its mic while the speaker is playing the other party's audio (as in a conference call, with multiple people in the room speaking at once).

I have tried the DemoVoice scene and my own scene with the WebRtcAudioDsp script in test builds on Android phones, but I don't notice a difference with the script attached (it still echoes very badly), whether using the Unity or Photon microphone types. The WebRtcAudioDsp script is on the "Voice Connection and Recorder" GameObject in the DemoVoice scene from Photon Voice 2 (Unity plugin). I have examined the script and believe it should work with no further configuration.

Is there a way to verify that the WebRtcAudioDsp script is actually scrubbing the audio? How can I move forward in removing the echoes? Thanks for any input you might have.
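One way to rule out a configuration problem is to assert the DSP settings from code and log them in the device build. The property names below (`AEC`, `ReverseStreamDelayMs`) come from the linked thread's topic and are assumptions about the Voice 2 API; the starting delay value is a guess to tune per device.

```csharp
using Photon.Voice.Unity;
using UnityEngine;

// Sketch: verify that echo cancellation is actually enabled on the
// WebRtcAudioDsp attached next to the Recorder, and log its state so
// it can be checked in a device build's logcat output.
public class EchoCancelCheck : MonoBehaviour
{
    void Start()
    {
        var dsp = GetComponent<WebRtcAudioDsp>();
        if (dsp == null)
        {
            Debug.LogError("WebRtcAudioDsp missing next to the Recorder");
            return;
        }
        dsp.AEC = true;                  // acoustic echo cancellation
        dsp.ReverseStreamDelayMs = 120;  // assumed starting value; tune per device
        Debug.Log($"DSP enabled={dsp.enabled} AEC={dsp.AEC} delay={dsp.ReverseStreamDelayMs}");
    }
}
```

If the logged values look right and the echo persists, the delay estimate is the usual suspect: AEC only works when the reverse-stream delay roughly matches the device's actual speaker-to-mic latency.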

I don't want to hear other players when their audio source is out of range.

Hi @JohnTube, I changed the audio source configuration using the following code, trying to shrink the voice distance to 1 meter in the virtual world:
    private void OnEnable()
    {
        speaker = target as Speaker;
        audioSource = speaker.GetComponent<AudioSource>();
        audioSource.rolloffMode = AudioRolloffMode.Linear;
        audioSource.maxDistance = 1;
        audioSource.minDistance = 0;
        audioSource.spread = 0;
        audioSource.dopplerLevel = 0;
        playDelayMsSp = serializedObject.FindProperty("PlayDelayMs");
    }
Question 1: I found that the audio source configuration is not transferred to the Photon server or restored on the remote clone prefab; the locally instantiated prefab and the remote clone both use the same default values. Is that right? If so, I can change the code above so that all instantiated prefabs use the same configuration values.

Question 2: there are two people in the virtual world, both with an audio source max distance of 1. The effect I want is that when one person is outside the other's audio range, they cannot hear each other. But currently we can still hear each other; the volume just gets smaller as the distance grows, and only around a distance of 30-40 does the voice become inaudible. What is wrong with my code, and how can I achieve the effect? Thanks.
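Two points worth noting, as a hedged sketch rather than a confirmed answer: the code above runs in a custom editor's OnEnable, which only affects the object selected in the editor, not instances spawned at runtime; and Unity's custom rolloff curve can guarantee silence exactly at maxDistance. Applying the settings in a MonoBehaviour on the speaker prefab (standard Unity API throughout) covers every instance:

```csharp
using UnityEngine;

// Sketch: put this on the same prefab as the Speaker so every spawned
// instance (local and remote) gets identical 3D settings, with a
// custom rolloff curve that reaches exactly zero at maxDistance.
[RequireComponent(typeof(AudioSource))]
public class VoiceRangeSettings : MonoBehaviour
{
    public float maxDistance = 1f;

    void Awake()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;      // fully 3D; 2D sources ignore distance
        source.minDistance = 0.1f;
        source.maxDistance = maxDistance;
        source.dopplerLevel = 0f;
        source.spread = 0f;

        // Custom curve: full volume at the source, hard zero at maxDistance.
        // The curve's x axis is distance normalized against maxDistance.
        source.rolloffMode = AudioRolloffMode.Custom;
        source.SetCustomCurve(
            AudioSourceCurveType.CustomRolloff,
            AnimationCurve.Linear(0f, 1f, 1f, 0f));
    }
}
```

Note that even with volume at zero the stream is still transmitted and decoded; true cutoff of network traffic would need interest groups or muting the remote speaker.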

Voice only working for some users

Hi, I'm using Photon Voice 2 and testing with 4-6 users (including me). One other person and I can hear each other fine, but the others in the room can't hear us or each other. For those users, the logs show Voice trying to initialize an empty microphone:

[Recorder.Recorder] [PV] MicWrapper: initializing microphone '', suggested frequency = 24000).
[Recorder.Recorder] [PV] MicWrapper does not support suggested frequency 24000 (min: 44100, max: 44100). Setting to 44100
[Recorder.Recorder] [PV] MicWrapper: microphone '' initialized, frequency = 44100, channels = 1.

The logs also show that they should be connected to the voice network, so they should be able to hear us:

[GeneralSettings.PhotonVoiceNetwork] PUN joined room, now joining Voice room
[GeneralSettings.PhotonVoiceNetwork] OnVoiceStateChanged from ConnectedToMasterserver to Joining
[GeneralSettings.PhotonVoiceNetwork] OnVoiceStateChanged from Joining to DisconnectingFromMasterserver
[GeneralSettings.PhotonVoiceNetwork] OnVoiceStateChanged from DisconnectingFromMasterserver to ConnectingToGameserver
[GeneralSettings.PhotonVoiceNetwork] OnVoiceStateChanged from ConnectingToGameserver to Authenticating
[GeneralSettings.PhotonVoiceNetwork] OnVoiceStateChanged from Authenticating to Joining
[GeneralSettings.PhotonVoiceNetwork] OnVoiceStateChanged from Joining to Joined
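The empty microphone name ('') in the logs above suggests no input device was picked up. As a hedged sketch, you can log Unity's available devices and pin one on the Recorder explicitly; the `UnityMicrophoneDevice` property and `RestartRecording` call are assumptions about the Voice 2 Recorder API, so verify them against your package.

```csharp
using Photon.Voice.Unity;
using UnityEngine;

// Sketch: list Unity's microphone devices and assign the first one to
// the Recorder explicitly instead of relying on the default ('').
public class MicPicker : MonoBehaviour
{
    public Recorder recorder;

    void Start()
    {
        foreach (var device in Microphone.devices)
        {
            Debug.Log("Mic device: " + device);
        }

        if (Microphone.devices.Length > 0)
        {
            recorder.UnityMicrophoneDevice = Microphone.devices[0]; // assumed property name
            recorder.RestartRecording(); // assumed: re-init capture with the new device
        }
        else
        {
            Debug.LogError("No microphone devices found; check OS/app mic permissions.");
        }
    }
}
```

An empty `Microphone.devices` array on the affected machines would point at missing microphone permissions or hardware rather than a Photon problem.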

Raise Event for Voice connection

When using the raise-event option for object instantiation, voice does not connect. Please clarify how to connect voice chat when instantiating objects via raised events. Thanks in advance.

HTTPS cannot be used by custom authentication.

HTTP works fine, but when HTTPS is deployed, the following error appears: OperationResponse 230: ReturnCode: 32755 (Custom authentication service error: Error). Parameters: {} Server: MasterServer Address: