The exciting world of game development continues as I create today for your viewing pleasure: a piece of plywood.
For texture mapping, I created the following image which I can tile on the y-axis. Help yourself if you want it!
Did more coding work on my meditation stations and had to remind myself how to broadcast messages. The reason is that part of the meditation station is actually attached to the player, letting me detect where the hands are positioned, whereas the other part is attached to the station itself, letting me make sure the player is standing in the right spot.
I hope to be able to show a video of the full effect soon.
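As a rough sketch of the broadcast setup described above, the station-side trigger can use Unity's BroadcastMessage to notify every component on the player hierarchy. The class and method names here are my own assumptions, not the project's actual code:

```csharp
using UnityEngine;

// Hypothetical sketch: the station-attached trigger tells every component
// on the player (and its children) that the player is standing in the
// right spot, without needing a direct reference to any of them.
public class StationZone : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // BroadcastMessage walks the player's hierarchy and calls any
            // method named "OnEnterStation" it finds on any component.
            other.gameObject.BroadcastMessage(
                "OnEnterStation",
                SendMessageOptions.DontRequireReceiver);
        }
    }
}
```

The player-attached component that tracks hand positions could then implement an OnEnterStation() method to start its checks.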
So I’ve been working on a few different ways to dynamically turn on particle effects based on user interaction. Right now I’m working on a particle effect that turns on when you walk into a specific area.
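A minimal version of that area-triggered effect, assuming the player object is tagged "Player" and the collider on this object is set to Is Trigger:

```csharp
using UnityEngine;

// Minimal sketch: a trigger volume that plays a ParticleSystem when the
// player walks in and stops it on the way out.
[RequireComponent(typeof(Collider))]
public class ParticleZone : MonoBehaviour
{
    public ParticleSystem effect;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
            effect.Play();
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
            effect.Stop();
    }
}
```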
I found a handy script to increase bounciness beyond 1.0 which the author called a “Power Bounce” but we can call it a “Super Bounce”. It works really well.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class HeadButt : MonoBehaviour
{
    public float str = 0.21f;
    public AudioClip HeadButtAudio;

    void OnCollisionEnter(Collision col)
    {
        Debug.Log("Hit!");
        if (col.gameObject.CompareTag("buttable"))
        {
            // Amplify the object's current velocity by str as an impulse,
            // giving a stronger bounce than a bounciness of 1.0 allows.
            Rigidbody rb = col.gameObject.GetComponent<Rigidbody>();
            if (rb != null)
                rb.AddForce(rb.velocity * str, ForceMode.Impulse);
        }

        SFXPlayer.Instance.PlaySFX(HeadButtAudio, transform.position,
            new SFXPlayer.PlayParameters()
            {
                Pitch = Random.Range(0.9f, 1.1f),
                SourceID = -1,
                Volume = 1.0f
            }, 0.0f);
    }
}
The only thing left to do at a later date would be to read accelerometer values from the headset and link them to the str (strength) parameter in the script.
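One way to approximate that, assuming the head collider sits on the headset's tracked transform: estimate the headset's speed from frame-to-frame position deltas and scale the strength with it. All names here are my own assumptions:

```csharp
using UnityEngine;

// Sketch: the tracked headset's movement shows up as transform motion each
// frame, so its speed can stand in for accelerometer readings. The HeadButt
// script could call GetStrength() instead of using a fixed str value.
public class HeadVelocityTracker : MonoBehaviour
{
    public float baseStrength = 0.21f;
    public float speedMultiplier = 0.5f;

    Vector3 lastPosition;
    public float CurrentSpeed { get; private set; }

    void Update()
    {
        CurrentSpeed = (transform.position - lastPosition).magnitude
                       / Time.deltaTime;
        lastPosition = transform.position;
    }

    public float GetStrength()
    {
        // Faster head movement produces a proportionally stronger headbutt.
        return baseStrength + CurrentSpeed * speedMultiplier;
    }
}
```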
Fixing the collision issues wasn’t too difficult. I just had to shuffle the layers around so that grabbable things don’t collide with the player.
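The layer shuffle can also be done in code rather than in the Layer Collision Matrix (Edit > Project Settings > Physics). A sketch, with layer names that are my own assumptions:

```csharp
using UnityEngine;

// Sketch: disable collisions between a "Grabbable" layer and the "Player"
// layer at startup, so held objects never push the player around.
public class CollisionLayerSetup : MonoBehaviour
{
    void Awake()
    {
        int player = LayerMask.NameToLayer("Player");
        int grabbable = LayerMask.NameToLayer("Grabbable");
        Physics.IgnoreLayerCollision(player, grabbable, true);
    }
}
```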
Next I implemented the basic mechanical buttons that are already built into the template I’m using. From a UX perspective I figured it would feel better to have physical mechanical buttons rather than rely only on UI canvases. I’m also creating a cable that I want the player to be able to plug into their arm to activate the computer screen.
Lastly I got this idea of adding a collider to the head to allow for headbutting interaction. I built a little test zone to try it out.
It works but I want the player to really be able to pop it up when they headbutt the ball but to do that I need to figure out a way to amplify the force that the player moves the Oculus headset with. Increasing the bounciness alone isn’t enough as the maximum is 1. Possibly something like the script here?
I got the elevator working!
The following additional article was helpful for me: https://jackpritz.com/blog/unity-colliderrigidbody-setups-in-xr
Unfortunately, everything else doesn’t work, because if I pick up an object it immediately starts colliding with me and I get launched into the air like a Team Rocket villain at the end of a Pokémon episode.
I will need to deal with physics/collision layers.
I’m also working on some other curious interactive content. Stay tuned for that!
I had a dream the other night about riding in a hot air balloon and I thought that might actually be a really cool premise for a VR game! Although I have other plans for feature game content I thought it would be a fun experiment to play around with.
To start I want to experiment with the sensation of going up and down and have a way for the player to control it.
I created a simple block that is tied to a static UI slider, which the player can grab and slide to move the block up and down.
Right now the VR avatar doesn’t follow any rules of physics so the elevator just passes right through without lifting you up. I found a handy tutorial by BeginnerVR that should help me get things working properly.
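A minimal sketch of the slider-driven block, mapping the slider's value onto the platform's height between two limits (field names are my own assumptions):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: the grabbable UI slider drives the elevator block's height.
public class SliderElevator : MonoBehaviour
{
    public Slider heightSlider;   // the static UI slider the player grabs
    public Transform platform;    // the block the player stands on
    public float minY = 0f;
    public float maxY = 5f;

    void Update()
    {
        Vector3 pos = platform.position;
        pos.y = Mathf.Lerp(minY, maxY, heightSlider.value);
        platform.position = pos;
    }
}
```

Note that moving a plain Transform like this doesn't interact with physics, which is consistent with the pass-through problem above; moving a kinematic Rigidbody with MovePosition in FixedUpdate is one common way to let a platform actually carry things.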
It’s been a while since I’ve used the GUI tools so I practiced by making a simple HUD that tracks how many hits you made on the ball with the stick.
Basically I just count each time the stick hits the ball and update the HUD. Something to keep in mind for the future is that a HUD breaks immersion pretty hard; I think it’s better to display things on an in-world screen or scoreboard whenever possible. But it’s still good to know how to do it.
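The counting logic is simple enough to sketch; the tag and field names here are assumptions on my part:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the hit-counter HUD: attached to the stick, it counts each
// collision with the ball and updates a UI Text element.
public class HitCounterHUD : MonoBehaviour
{
    public Text counterText;
    int hits;

    void OnCollisionEnter(Collision col)
    {
        if (col.gameObject.CompareTag("ball"))
        {
            hits++;
            counterText.text = "Hits: " + hits;
        }
    }
}
```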
I am planning to experiment with things like goggle fog, frost, screen cracks, etc.
After a decent amount of finagling I got the tetherball working. The rope even wraps around the post which was the best I could have hoped for.
The rope can’t handle extreme changes in velocity but for a simple solution it works quite nicely.
Using Jacob Fletcher’s Line Renderer script, here are the settings I used for my tether rope in Unity to get it working in VR (the scale is much smaller than the defaults):
As a side-note, I was so immersed that when I finished I turned around and put my controllers down on top of a wall I had built in VR without realizing that there was no table there in real life! But I had placed a TV table there earlier and it happened to be at about the correct height. Talk about blending RL and VR!
Today I worked on implementing interactive tetherball. I’m using a simple LineRenderer script I found that creates ropes dynamically. It’s going terribly well so far!
I think there’s some internal collisions happening that I need to figure out. It seems to also be morphing the tetherball on the end for some reason.
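For reference, the core of a dynamic rope like this is just a LineRenderer redrawn each frame through a chain of jointed segments. This is my own minimal version of the idea, not the script from the post:

```csharp
using UnityEngine;

// Sketch: draw the rope through a chain of segment transforms (e.g. small
// Rigidbodies connected by joints from the post down to the ball).
[RequireComponent(typeof(LineRenderer))]
public class RopeRenderer : MonoBehaviour
{
    public Transform[] segments;   // post anchor, segment bodies, ball

    LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = segments.Length;
    }

    void Update()
    {
        // Follow the physics-simulated segments every frame.
        for (int i = 0; i < segments.Length; i++)
            line.SetPosition(i, segments[i].position);
    }
}
```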