LIFX MCP Server with Claude Chat on Railway

July 26th 2025

Hello kind people of the internet. This blog is a continuation of my development efforts with MCP servers, using the LIFX light bulb as an IoT testing platform.

At the end of this project you should be able to deploy a backend server to Railway that will allow a client side application to control a lightbulb using a Claude AI chat bot. The pic below shows the client app running on both my desktop and my phone.

TL;DR

Server

The server source code is up on GitHub at:
https://github.com/tenace2/LifxMCPServerBackend
...I'll briefly explain how to deploy to Railway in the details below,
...but I will not be supplying a link to my actual on-line server.

Client

The client source code is up on GitHub too: https://github.com/tenace2/LifxFrontEnd
...and the client code has a GitHub Pages implementation as well,
...so you can just use my client app to exercise the backend server if you don't want to deploy your own client.
https://tenace2.github.io/LifxFrontEnd/

Minimum requirements


Background

Original Project: Was local only
As part of my digital garden, there is another blog about developing the same basic app (an MCP-based chat control of a LIFX lightbulb), with both the client and the server in a single project that runs locally with npm run dev.
This original project was designed to run locally only.

https://my-digital-garden-vercel-orpin.vercel.app/claude-ai-mcp-lifx/claude-ai-mcp-lifx-lights/

This locally-run code is also up on GitHub (see that blog for the link) and is a quick way to get started with MCP servers and Claude chat.

Railway backend server

This next iteration splits the backend server into its own project and then deploys the server code to Railway. And thus, you can control your IoT LIFX lightbulb from anywhere on earth (not just locally).

The README.md docs on the GitHub repository will explain the details for deploying the server. This blog is to explain the background of what's going on.

Why Railway?

Once you clone/copy the server code to your own repository, it will not run on GitHub Pages, because Pages is a static-site hosting service and will not facilitate server functionality. (While this seems obvious, it needed saying.) So, you will need a hosting service, and there are a lot of them...I chose Railway.

Railway will host servers and best of all it lashes up directly to your GitHub repository.

Easy Peasy, lemon squeezy. It's dead simple. It just works. Amazing.

Also, Railway has a nice dashboard showing usage and activity. At $5 a month for a hobby user, this is really great.

High-level: What's going on?

Maybe you're new to this realm, so I'm going to attempt a real high-level explanation,
...leg-bone, knee-bone, connected to the foot-bone.

Hopefully you can follow the above diagram. I'll attempt to narrate reading from left to right.

And you can then control your LIFX lightbulb via an MCP server using the Claude Chat Bot via a browser based app! Works on your desktop, and it also works on your phone.

Server Code

Here again, is the link to the GitHub source code for the server.
https://github.com/tenace2/LifxMCPServerBackend

Please reference the project link above, as there is already copious documentation. Needless to say, the backend Server code was developed in tandem with the client side application. Below is a very simplified diagram of the server functionality.

Hopefully you can follow the above diagram. I'll attempt to narrate reading from left to right.

MCP Server Manager

The mcp-server-manager.js file is an Express.js server (https://expressjs.com/)
It controls the two main features of the overall app:
- the API calls to the LIFX API, which control the light
- and the Claude chat capability, which allows for conversational control of the light.

CRUCIAL NOTE: CORS is restricted to localhost and my client on GitHub Pages

This server is one of my first attempts to deploy to Railway, and it does not have any environment variables set up on Railway to flexibly handle other ALLOWED_ORIGINS.

My client app is available on GitHub pages:
https://tenace2.github.io/LifxFrontEnd

My front end client source code is available on GitHub:
https://github.com/tenace2/LifxFrontEnd

If you create your own front end client for this server, you can alter the CORS
section of the code in the mcp-server-manager.js file at about line 60; a snippet
is provided below. As you can see, the allowed origins are very restricted.

// CORS configuration
const corsOptions = {
	origin: process.env.ALLOWED_ORIGINS?.split(',') || [
		'https://tenace2.github.io',
		'http://localhost:9003',
		'http://localhost:5173', // Added for local client testing
	],
	// ...remaining CORS options omitted
};

Lifx MCP Server

The file lifx-api-mcp-server.js, like most MCP servers, is just a wrapper around a REST-based API.

Here's a link to the LIFX API documentation so you can review:
https://api.developer.lifx.com/reference/introduction
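To make the "wrapper" idea concrete, here's a rough sketch of what a raw LIFX API call looks like underneath. The endpoint shape (PUT /v1/lights/{selector}/state with a Bearer token) comes from the LIFX docs linked above; the helper function name is just for illustration.

```javascript
// Sketch of a raw LIFX HTTP API call (the thing the MCP server wraps).
// Builds the request for PUT /v1/lights/{selector}/state.
function buildSetStateRequest(token, selector, state) {
	return {
		url: `https://api.lifx.com/v1/lights/${selector}/state`,
		method: 'PUT',
		headers: {
			'Authorization': `Bearer ${token}`,
			'Content-Type': 'application/json',
		},
		body: JSON.stringify(state),
	};
}

// Example: turn every light on and make it blue.
const req = buildSetStateRequest('YOUR_LIFX_TOKEN', 'all', {
	power: 'on',
	color: 'blue',
	brightness: 0.8,
});

// Uncomment to actually send it (requires a real LIFX token):
// fetch(req.url, { method: req.method, headers: req.headers, body: req.body })
// 	.then((res) => res.json())
// 	.then(console.log);
```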

This Model Context Protocol (MCP) "wrapper" really just adds an extra layer of information so that an LLM (Large Language Model) can comprehend what's going on with the API calls.
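In code terms, that extra layer is essentially a tool description: a name, a natural-language description, and a JSON schema for the arguments, so the LLM knows what it can call and how. Below is a minimal illustrative sketch using a plain object; the real server registers tools via the MCP SDK, and the tool name and fields here are made up for illustration.

```javascript
// Minimal sketch of the "extra layer" an MCP server adds around a REST call:
// metadata the LLM reads, plus a handler that forwards to the real API.
const setLightStateTool = {
	name: 'set_light_state',
	description: 'Set power, color, and brightness of LIFX lights. ' +
		'Use selector "all" to target every light.',
	inputSchema: {
		type: 'object',
		properties: {
			selector: { type: 'string', description: 'Which lights, e.g. "all"' },
			power: { type: 'string', enum: ['on', 'off'] },
			color: { type: 'string', description: 'e.g. "blue" or "#ff0000"' },
			brightness: { type: 'number', minimum: 0, maximum: 1 },
		},
		required: ['selector'],
	},
	// The handler just forwards the LLM's chosen arguments to the REST API.
	handler: async (args) => {
		// In the real server this would call the LIFX HTTP API; stubbed here.
		return { forwardedTo: 'LIFX API', args };
	},
};
```

When Claude decides the user wants the light changed, it picks this tool by name and fills in arguments that satisfy the schema; the server then makes the actual HTTP call.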

Lifx API key

You will of course need a LIFX API key...these are free! Here's a link:
https://api.developer.lifx.com/reference/how-to-use-the-following-examples

I recommend installing the LIFX app on your phone (which creates an account) and lashing your light bulb up to your WiFi first and getting things working...this will make getting the API key working a whole lot easier.

Claude API

The file claudeAPI.js deals with the chat interface to the Claude API. The documentation on my GitHub repo goes into more detail regarding the setup for Claude.

Here's a link to the Claude chat API so you can review:
https://docs.anthropic.com/en/api/overview
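For a sense of what the server sends, here's a rough sketch of a Claude Messages API request body. The header names (x-api-key, anthropic-version) and the max_tokens/system/messages fields are from the Anthropic docs linked above; the model name is just a placeholder, so check the docs for current model names.

```javascript
// Sketch of a Claude Messages API request body
// (POST https://api.anthropic.com/v1/messages).
function buildClaudeRequest(systemPrompt, userMessage, maxTokens) {
	return {
		model: 'claude-3-5-haiku-latest', // placeholder; see Anthropic docs for current models
		max_tokens: maxTokens, // the "Output Token Limit" knob
		system: systemPrompt,  // the pre-prompt / guardrail text
		messages: [{ role: 'user', content: userMessage }],
	};
}

const payload = buildClaudeRequest(
	'You are a light-control assistant for a LIFX bulb.',
	'Turn the light blue',
	500
);

// The request itself is sent with headers:
//   'x-api-key': <your Claude key>
//   'anthropic-version': '2023-06-01'
//   'content-type': 'application/json'
```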

Claude API key

You will need a Claude API key. Here's a link to the page where you can get a key:
https://docs.anthropic.com/en/api/admin-api/apikeys/get-api-key

Sorry, these API keys from Claude are not free, because usage is billed in tokens.
But good news: they have a $5 dollar burger deal! It's tasty and sooo satisfying!

All joking aside, Anthropic offers a $5 getting-started deal, which expires after one month. It's a screaming deal. I used up literally thousands and thousands of tokens testing and flogging on this project until it confessed its bugs, quirks and perplexing behaviors, and at the end of the month my balance still had plenty of money left on the account.

As proof, here's a pic of my Claude API dashboard for July...as you can see, it barely cost anything, a mere $1.67 despite the seemingly high usage of 400K+ tokens.

Detailed Logs

There are of course detailed activity logs with input and output tokens.

Claude API payment

This was fairly easy to control, as you can just buy $5 at a time using Stripe,
...and consume tokens as needed, thus no worries of waking up some day with a huge bill.
Note: when you run out of money your API calls will return an HTTP 500 error.

Client code

There's a bunch going on in the client, which is outlined below.

If you want to know more about how the client app is put together, read the documentation on the GitHub repo.

But here's what's crucial for getting things lashed up:

To Get Started

Server and Session Management

Input your server URL to the client.
Assuming you cloned the server repo to your desktop VS Code IDE, simply run npm run dev and the server will launch locally.
Then (if running a local server) input the local URL to the client: http://localhost:3001

Demo Key
This LifxDemo key needs to be provided to the backend server by the client. It was just a trial idea to help prevent possible abuse of the backend server if the Railway URL were discovered independently of any information about the GitHub repo or the client. My thought was that I might make this an environment variable on the Railway backend server, so as to make it configurable and changeable. TBD.

API Keys

Get your LIFX and Claude API keys, then paste these keys into the app.
Note that you can test the LIFX key with the button (but the button does not test the Claude key).

You should be good to go at this point.

Other Really Good Client Stuff

The overall project was approached from the standpoint of demonstrating and learning about MCP servers and chatbot behavior.

Server Logs

The backend server uses the Winston logging library (...I wish I had gone with Pino, but that's for the next project). The pic below illustrates that server logs are pulled into the client if you have configured your client to do so:

Server-Session Management

You can choose the server URL that you want to run from.

Server and Session Management

This is also illustrated above. [[#Server and Session Management]]

Session Management
Note that you will be able to see your client session ID, which is very useful when reviewing server logs in the Railway dashboard (or in your terminal window if running the server locally).

API Settings & Management

This is illustrated above. [[#API Keys]]

Quick Light Controls

This does not use the Claude Chat capability,
...it only uses the LIFX MCP API features.

So, if something goes haywire with the Claude chat, you can always test the light with this part of the app.

Claude AI Chat Control

This is where things get interesting. The Claude API is not what I originally thought it was.

No API context memory
The Claude API that I'm using as of July 2025 is stateless: each request is a one-shot thing, with no memory of previous API requests. The usual workaround is to have the client resend the prior conversation turns in the messages array with each request, but this app doesn't currently do that.

System Prompts

To provide some guardrails for ensuring the app is used for controlling a LIFX light bulb, a "pre-prompt" is sent to Claude.

The client app also explains what these prompts do and provides a way to turn them on and off. You can get very interesting results by disabling pre-prompts, and I highly recommend that you noodle around with this feature.

The prompt below is sent to Claude with every API request from the backend server, if the System Prompt is Enabled:

General Prompt - Strict Prompts

If the System Prompt is Disabled in the client app, the General prompt below is hard coded into the server app and is always sent to Claude by the backend server.

This general prompt lets the LLM (Claude) know about the MCP server. This "strict" prompt is baked into the backend server code...without this strict prompt I found the app pretty much wouldn't work, as Claude would not even know there is an MCP server to control a light with.

But note: this general prompt allows Claude to answer general questions about pretty much anything...so be careful.

I suppose you can fiddle with the server code yourself and comment out and disable the strict prompt and see how the app behaves...but per my testing, it breaks the app.

Keyword Filter

The app's client chat needs to contain at least something to do with controlling a light, or it won't be submitted to Claude. I implemented this to avoid eating up Claude tokens if you don't at least use words like "light" or "color", etc. in the chat.

You can turn this on and off and see how the app behaves. No doubt there are probably ways to jail-break this feature...again, this is a demo and learning app.
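A minimal version of that kind of keyword gate might look like the sketch below. The actual filter lives in the client repo; the keyword list and function name here are illustrative.

```javascript
// Illustrative keyword gate: only forward chat messages to Claude if they
// mention something light-related, to avoid burning tokens on off-topic chat.
const LIGHT_KEYWORDS = ['light', 'lamp', 'color', 'bright', 'dim', 'lifx'];

function passesKeywordFilter(message) {
	// Word-boundary match so a short keyword can't match inside another word.
	return LIGHT_KEYWORDS.some((kw) => new RegExp(`\\b${kw}\\b`, 'i').test(message));
}

// passesKeywordFilter('Turn the light blue')               -> true
// passesKeywordFilter('Give me a recipe for potato salad') -> false
```

As the text above notes, a simple filter like this is easy to jail-break (e.g. "turn the light pink and also..."), but it's enough to stop purely off-topic chat from costing tokens.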

Token Settings & Statistics

This helps you see what your Claude chat bot costs are as you interact with the app...and for just controlling a LIFX light bulb, these token costs are dirt cheap. Of course you could also monitor the Claude api console, which is discussed above [[#Claude API dashboard (console.anthropic.com)]]

Throttle your Chat Output!

The Output Token Limit is there for a reason. If you ask Claude to provide the text of all of Shakespeare's plays, it will probably attempt to do just that. So, if you turn off the System Prompt Control you can get unbounded output, and can thus consume all your Claude API tokens (read below: Fun with Claude Chat).
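One way to see the cap in action: the Messages API response includes a stop_reason field, which is 'max_tokens' when the reply was truncated by the limit (the field and its values are from Anthropic's docs; the helper function is illustrative).

```javascript
// Detecting a truncated reply: Claude's response reports why generation stopped.
function wasTruncated(response) {
	return response.stop_reason === 'max_tokens';
}

// e.g. with a tight 100-token limit and a request for a full recipe:
const cutOff  = wasTruncated({ stop_reason: 'max_tokens' }); // true: limit hit
const natural = wasTruncated({ stop_reason: 'end_turn' });   // false: finished normally
```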

Claude Chat activity and Input

There are a couple of things going on in this component.

Fun with Claude Chat

You can have some fun with this!
...first I suggest you turn off your System Prompt,
...then ask for something ridiculous, such as "turn the light pink and give me a recipe for potato salad".

I also suggest changing the Output Token Limit, because if you decrease it to, let's say, 100, then the chat will drastically cut off the recipe. But your mileage may vary, results are not guaranteed; hence the fun of fiddling around with this stuff.

Acknowledgements, Credits and Disclaimers

Disclaimer:

Hey...I make no claims this software works or doesn't work...it is provided as is. I've tried to make the software simple, understandable and secure, and I flogged on the Claude agent to review the code base (for both the server and the client) to look for flaws. There is of course always more I could do...but sigh, life is short, and the burger image above made me realize it's after lunch time.

Acknowledgements:

Burke Holland - Roll Your Own MCP!!!
I just have to mention this guy...he has some great stuff. The following video got me started down this MCP rabbit hole.

I highly suggest reviewing this video:
https://www.youtube.com/watch?v=yUaz89m1M5w

Burke explains how you can use the CoPilot Agent mode to roll-your-own Lifx MCP server by:
A. copying the MCP protocol guidelines into a markdown file,
B. and then copying the LIFX API specs into a markdown file.

Then directing the Claude AI agent to review both md files and generate an MCP server. Presto: like magic you have an MCP server dedicated to the LIFX lightbulb that any chatbot can interact with. The effort produced a working MCP server that I used in my first project.

And the real point of Burke's video is that if you see any REST-based API out there that you want to lash up to any type of LLM, you can make it happen fast.

Credits:

The MCP server actually used in this project is from James Furey. It seems to be working great, and I only made a few very small changes. Note that James also used AI to generate this MCP server. I found it up on the MCP.so web site and shamelessly borrowed it for this project.

https://mcp.so/server/lifx-api-mcp-server/furey?tab=content

GitHub link to James Furey's LIFX server repo:
https://github.com/furey/lifx-api-mcp-server

I highly suggest reviewing James's README within the MCP server folder...it's very well put together.

Note that I had Claude AI compare my previous roll-your-own LIFX MCP server (the one Burke Holland outlined how to create) with James Furey's MCP server, and James's was clearly the winner. Thank you James!