
A few questions about DNMs : programming | Torhoo darknet markets

I'm already an experienced programmer, but with nothing related to onion/website hosting. Feel free to answer technically.

1. What database solution do they typically use? Something like MySQL would be great to use, but it's not very scalable.
2. How do mirrors work / how do you set them up? I know that load balancing is done via onionbalance, but most DNMs have a couple of mirrors on different addresses that appear to be hosted separately but share the same information. Are they just re-hosting the site and connecting to the same database from another machine?
3. How are developers trusted to work on a DNM? Wouldn't they just swindle all of the reserve money and leave, since there are literally no consequences?
/u/FraudBay
1 points
6 months ago
1) PostgreSQL
2) Tomcat with session replication.
/u/minnow 📢
1 points
6 months ago
Can Tomcat still be used if Apache / PHP is not?
/u/FraudBay
1 points
6 months ago
You can pair nginx with Tomcat through a local reverse proxy; works like a charm.
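A minimal sketch of that pairing, assuming Tomcat's default HTTP connector on 127.0.0.1:8080; the ports and header choices here are illustrative assumptions, not taken from this thread:

```nginx
# nginx as a local reverse proxy in front of Tomcat.
server {
    listen 127.0.0.1:8081;                  # port Tor's HiddenServicePort would target
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:8080;   # Tomcat's default HTTP connector
        proxy_set_header Host $host;
        # Deliberately no X-Forwarded-For: behind Tor there is no meaningful
        # client IP, and forwarding less information is safer.
    }
}
```

Tomcat keeps serving the servlets/JSPs; nginx handles TLS-free local termination, static files, and request filtering in front of it.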
/u/kurogiri
1 points
6 months ago
I thought there were always several frontend servers and one backend server, like tor -> reverse proxy -> frontend server -> backend server, with the DB on yet another server if the marketplace is big.
/u/kurogiri
1 points
5 months ago
Got it, thanks for the explanation! I agree that a single backend server makes sense, but I always thought you needed multiple frontend servers for load balancing and availability. Your solution with one backend and multiple DDoS fronts actually sounds more efficient, especially when it comes to not having to update each frontend separately. Sounds like a good balance.
Do you think working with RESTful APIs would be useful here to make the communication between front and backend more flexible? Would that bring some advantages in this architecture?
Actually, with Kubernetes isn’t it possible to update all frontends at once even if you have many separate servers? I thought you could use it to manage all deployments centrally and push updates to all nodes simultaneously. Wouldn’t that solve the issue of having to update each frontend manually?
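For what it's worth, the centralized rollout being asked about is exactly what a Kubernetes Deployment provides: one spec drives every frontend replica, so a single image change rolls out everywhere. Names and image tags below are hypothetical:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: market-frontend                 # hypothetical name
spec:
  replicas: 3                           # all frontend pods share this one spec
  selector:
    matchLabels:
      app: market-frontend
  template:
    metadata:
      labels:
        app: market-frontend
    spec:
      containers:
      - name: frontend
        image: registry.local/frontend:v2   # bump this tag to update every pod
```

Changing the image (e.g. `kubectl set image deployment/market-frontend frontend=registry.local/frontend:v3`) triggers a rolling update across all replicas, so no frontend is updated by hand. Whether running a Kubernetes control plane fits this threat model is a separate question.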
/u/goldfinchh 🍼
1 points
2 weeks ago
How do you make a frontend server without JS, using only HTML and CSS for the frontend?
Which programming language do you use? Honestly, anything works if you understand how it functions. I don't see how it wouldn't work. For example, JavaScript can be used effectively on the backend. While PHP is commonly used, you can actually use any programming language that runs on the backend.
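To illustrate the point, here is a server-rendered page in Python (chosen purely as an example backend language): the backend just emits HTML strings, so the browser needs no JavaScript at all.

```python
# Minimal sketch of server-side rendering without JS.
# The page template and function names are made up for illustration.
from string import Template

PAGE = Template(
    "<html><head><title>$title</title></head>"
    "<body><h1>$title</h1><p>$body</p></body></html>"
)

def render(title: str, body: str) -> str:
    """Return a complete HTML page as a plain string."""
    return PAGE.substitute(title=title, body=body)

if __name__ == "__main__":
    print(render("Listings", "No JavaScript required."))
```

Any backend language (PHP, Java via Tomcat, Python, even Node-side JavaScript) can do the same; the "no JS" constraint only applies to what you ship to the browser.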
/u/goldfinchh 🍼
1 points
1 week ago
Do you mean generating HTML templates, or what?
No, why generate templates? Just write it.
/u/drisdane [MOD] WebDev│SysOp
1 points
6 months ago
> MySQL would be great to use, but it's not very scalable.

What are you going to scale? Seriously. You might want master-slave replication for backups, which is super easy. But you'll only have to worry about things like sharding if you're handling massive amounts of data. Perhaps something the size of Dread might have considered this, but I haven't seen much else on the darknet that big.
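For the backup case, a minimal master-slave setup is just two small config fragments; the server IDs and log names below are assumptions for illustration:

```ini
# --- my.cnf on the source (master) ---
[mysqld]
server-id = 1
log_bin   = mysql-bin        # binary log the replica reads from

# --- my.cnf on the replica (slave) ---
[mysqld]
server-id = 2
relay_log = relay-bin
read_only = ON               # replica serves as backup only
```

The replica is then pointed at the source with `CHANGE MASTER TO` (or `CHANGE REPLICATION SOURCE TO` on MySQL 8.0.22+) and started with `START SLAVE` / `START REPLICA`. That alone gives you a live backup without any sharding.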

I've only been here a short while, but I've already seen a few people who want to start a small market or shop and worry about scalability. Sure, keep it in mind, but also consider that you probably won't have to deal with it for a long time, if your project ever even gains serious momentum.

Oh and on point 3 you can trust me as well ;)
/u/ran
1 points
6 months ago
1. MySQL or PostgreSQL with replication
2. Each onion address could be exposed on an edge server. Edge servers should be replaced frequently to make tracing harder. Each edge server should have DDoS protection, a WAF, and load balancing. After a session has been accepted, traffic could be routed to frontend servers that render the HTML/CSS pages. The frontend servers gather data from backend servers. The important thing is the multi-layered design, which could be deployed on a single server or as a multi-layered cluster depending on traffic and threat model.
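One way to sketch the edge layer described above: a disposable edge box runs Tor plus a local filtering proxy, and only that box holds the onion service keys. Paths and ports here are assumptions:

```text
# torrc on a disposable edge server (hypothetical paths/ports)
HiddenServiceDir /var/lib/tor/market/     # onion keys live only on the edge
HiddenServicePort 80 127.0.0.1:8080       # hand traffic to the local WAF/proxy
```

The proxy on 127.0.0.1:8080 applies rate limiting and WAF rules, then relays accepted sessions inward to the frontend tier. Rotating an edge server then just means moving the `HiddenServiceDir` contents to a fresh box and destroying the old one.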
/u/goldfinchh 🍼
1 points
2 weeks ago
How do you handle replacing servers? Are there any tools that help with replacing them every week, for example? And is replacing servers every week/month actually good practice for privacy, to make a DNM less traceable?