Key Highlights
- The facility features 18MW of power with redundant feeds, Tier III maintainability, and space for future buildouts, addressing Manhattan’s power constraints while supporting high-density workloads.
- Its vertical infrastructure includes full-floor cooling, fiber risers, and rooftop antenna arrays, supporting both dense fiber connectivity and point-to-point wireless for urban edge applications.
- Sabey emphasizes security through physical controls, elevated positioning to mitigate flood risk, and a heavily staffed, operationally disciplined environment ensuring high reliability and resilience.
- The building’s strategic location near Wall Street and dense network infrastructure makes it ideal for latency-sensitive financial and AI inference workloads, supporting real-time decision-making.
- Ongoing upgrades and flexible design support diverse customer needs, from retail colocation to private suites and powered shell, with future capacity already being prepared for AI inference and hybrid cooling configurations.
At 375 Pearl Street, Sabey Data Centers has built something that feels increasingly rare in the urban core: a data center that is not just surviving Manhattan’s constraints, but turning them into advantages.
SDC Manhattan, Sabey’s facility at the foot of the Brooklyn Bridge, has long been positioned as a low-latency, carrier-dense interconnection site in the financial heart of New York. The company’s own materials describe it as a 222,000-square-foot mission critical facility with 18MW of critical capacity, 23 carriers, and a rooftop antenna array, all wrapped inside a 32-story mixed-use and office tower. The site is also marketed around security, operational efficiency, and proximity to Wall Street.
But a recent walkthrough of the facility with Atim Bajrushi, Customer Solutions Specialist at SDC Manhattan, made clear that the real story of 375 Pearl is not just about specs. It is about how Sabey has adapted a vertical Manhattan telecom landmark into a modern, resilient, highly connected data center platform.
And now, according to a Sabey press release issued December 9, 2025, the company is explicitly pushing that platform toward a new role: a Manhattan hub for advanced AI inference workloads.
That direction fits what the building already is.
A data center built around location, connectivity, and physical reality
The basic pitch for SDC Manhattan is straightforward. The facility sits within walking distance of Wall Street and dense concentrations of network infrastructure, making it a natural fit for latency-sensitive workloads, particularly those tied to finance and other real-time applications. Sabey’s brochure emphasizes that positioning directly, describing the building as “the perfect data center for your low-latency, fintech needs,” while also stressing its dense connectivity options and unusual security posture, including its location within NYPD barricades.
Bajrushi underscored that same point during the tour, but in more practical language. He noted that the site is “very unique,” adding that it is close to Wall Street, can function as an edge location for many customers, and can interconnect to other sites through carriers as needed.
That edge role is increasingly important. In the December 2025 release, Sabey said SDC Manhattan is becoming a premier site for organizations running AI inference workloads in the city, arguing that its combination of dense connectivity, scalable power, and flexible cooling makes it well suited for latency-sensitive, high-throughput systems. The press release framed the facility as a place where enterprises can deploy inference clusters close to end users, reducing response times and supporting real-time decision-making.
That claim is not hard to see once you walk the building.
The vertical campus idea
What makes 375 Pearl different is that it behaves less like a conventional suburban data center campus and more like a vertical campus, with infrastructure stacked floor by floor inside a high-rise shell.
The Sabey brochure lays this out visually, showing a building with:
- cooling towers and condenser water plant at the top
- office and mixed-use floors above parts of the tower
- multiple data center floors
- dedicated UPS and DC operations space
- generator and chiller plant levels
- Con Edison substations and future generator space on lower floors
- fuel and water storage in the cellar
Bajrushi’s tour added texture to that diagram. On the lower infrastructure floors, he pointed out the switchgear rooms, Con Edison feeds, chillers, and the ring bus arrangement that ties the utility service together. Higher up, he walked through generator alley, UPS rooms, operations space, the colocation floors, meet-me-room infrastructure, and finally the roof, where the cooling towers and antenna systems make the building’s role as both data center and communications hub immediately visible.
At one point Bajrushi described the building’s vertical shafts, or what the team calls “bustles,” as one of the site’s most important physical assets. He explained that these shafts run the full length of the building and allow Sabey to route piping, conduit, and fiber cleanly from the cellar to the roof. Bajrushi added that the same arrangement also supports the rooftop wireless customers, who rely on fiber pathways as well as antenna placement.
That kind of infrastructure continuity is one reason the site feels so coherent despite its complexity.
Power: Redundant utility design in a power-constrained market
The tour made equally clear that in Manhattan, power is still the central gating factor.
The brochure describes SDC Manhattan as offering 18MW of aggregate power delivered to the building, backed by redundant electrical and mechanical systems, backup generators, and Tier III-type concurrent maintainability. The December 2025 press release updated that picture in a more market-facing way, noting that Sabey is one of the few colocation providers in Manhattan with available power, including nearly a megawatt of turnkey power and 7MW of utility power across two powered shell spaces.
Bajrushi’s explanation of the electrical topology helped show how Sabey has made that possible.
Standing on the third floor, he described a ring bus tying together four Con Edison feeds. Bajrushi said the feeds all originate from the same substation but take different paths into the building, creating redundancy outside the building as well as within it. He added that if one feed fails, the ring bus remains unaffected, and that only one feed is needed to power everything currently in operation. He also noted that Sabey has the ability to add two more feeds in the future if expansion calls for it.
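To make that concrete, here is a minimal sketch of the N-1 logic a ring-bus arrangement like this implies. The per-feed capacities and operating load below are hypothetical (the article does not publish per-feed ratings); the point is simply that any single feed, and in this model any three, can drop without shedding load.

```python
# Minimal sketch of the N-1 check implied by a ring-bus arrangement:
# four utility feeds tie into a closed ring, and the bus stays energized
# as long as the surviving feeds can carry the load. Feed capacities and
# site load are illustrative, not Sabey's actual figures.

from itertools import combinations

FEEDS = {"feed_A": 6.0, "feed_B": 6.0, "feed_C": 6.0, "feed_D": 6.0}  # MW, hypothetical
SITE_LOAD_MW = 5.0  # hypothetical current operating load

def bus_survives(failed: set) -> bool:
    """The ring bus stays up if the surviving feeds can carry the load."""
    available = sum(mw for name, mw in FEEDS.items() if name not in failed)
    return available >= SITE_LOAD_MW

# Exhaustively check every combination of feed failures.
for k in range(len(FEEDS) + 1):
    for failed in combinations(FEEDS, k):
        ok = bus_survives(set(failed))
        label = ", ".join(failed) or "none"
        print(f"{k} feed(s) out ({label}): {'OK' if ok else 'LOAD SHED'}")
```

In this toy model, even three simultaneous feed losses leave the bus carrying the load, which matches Bajrushi's point that only one feed is needed for everything currently in operation.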
That matters in a city where available utility capacity is hard to come by and where many data center conversations end not with square footage but with a megawatt number.
Bajrushi also noted that physical space is not the core constraint at 375 Pearl. He said the building still has plenty of room for future buildouts, including open areas that could become additional white space, chiller capacity, or other infrastructure. The bigger question, he suggested, is how and when power and supporting systems get installed.
That observation aligns neatly with Sabey’s press release. The company is effectively arguing that SDC Manhattan has crossed an important threshold: it is not just connected and secure, but one of the few Manhattan sites with enough available capacity to support the next wave of digital infrastructure demand.
Generators, fuel, and maintainability
If the utility side of the story is about redundancy, the backup power story is about preparedness.
Walking the fourth floor, Bajrushi pointed out the building’s large mission critical generators, describing them as the biggest generators at any Sabey site. He explained that the facility maintains multiple fuel tanks in the cellar, with about 72 hours of on-site fuel, fuel polishing capability, and the ability to transfer fuel between tanks if needed. Bajrushi added that Sabey also has delivery contracts in place for emergency refueling.
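The 72-hour figure is, at bottom, tank capacity divided by burn rate. A quick back-of-envelope sketch, using illustrative generator counts, burn rates, and tank sizes rather than Sabey's actual specifications, shows the arithmetic behind a runtime claim like that:

```python
# Back-of-envelope check on a ~72-hour fuel figure. All numbers below are
# illustrative (generator count, burn rates, and tank sizes are not
# Sabey's published specifications); they only show the shape of the math.

GENERATORS_RUNNING = 4
BURN_GAL_PER_HR_EACH = 180                 # rough full-load burn for a ~2.5MW diesel
TANK_GALLONS = [20_000, 20_000, 14_000]    # hypothetical cellar tanks

usable_gal = sum(TANK_GALLONS) * 0.95      # allow for unusable tank heel
plant_burn = GENERATORS_RUNNING * BURN_GAL_PER_HR_EACH
runtime_hr = usable_gal / plant_burn

print(f"Usable fuel: {usable_gal:,.0f} gal")
print(f"Plant burn at full load: {plant_burn:,} gal/hr")
print(f"Runtime: {runtime_hr:.0f} hours (~{runtime_hr/24:.1f} days)")
```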
He described the generator testing regime in similarly matter-of-fact terms. Bajrushi noted that the site performs monthly load testing and an annual “pull-the-plug” test in which the generators are forced to carry the building load. He also walked through the sequence in which the UPS shifts to battery, generators are called to start, and the system remains on generator until operators are comfortable moving back to utility power.
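That sequence is easy to render as a simple state machine. The sketch below follows the order Bajrushi described, including the deliberately manual return to utility; the class names and structure are illustrative, not the site's actual controls logic:

```python
# A minimal state-machine sketch of the failover sequence described above:
# utility drops, the UPS rides through on battery, generators start and
# pick up the load, and the plant stays on generator until operators
# manually approve the return to utility.

import enum

class Source(enum.Enum):
    UTILITY = "utility"
    BATTERY = "ups battery"
    GENERATOR = "generator"

class PowerPlant:
    def __init__(self):
        self.source = Source.UTILITY

    def on_utility_loss(self):
        # UPS carries the load instantly while generators are called to start.
        self.source = Source.BATTERY
        print("Utility lost -> UPS on battery, generators starting")

    def on_generators_ready(self):
        self.source = Source.GENERATOR
        print("Generators at speed/voltage -> load transferred to generator")

    def on_operator_return_to_utility(self, utility_stable: bool):
        # The return is deliberate: operators decide when utility is trusted again.
        if utility_stable:
            self.source = Source.UTILITY
            print("Operators approved retransfer -> back on utility")
        else:
            print("Utility not yet trusted -> remaining on generator")

plant = PowerPlant()
plant.on_utility_loss()
plant.on_generators_ready()
plant.on_operator_return_to_utility(utility_stable=False)
plant.on_operator_return_to_utility(utility_stable=True)
```

The monthly load tests and the annual pull-the-plug exercise exist to prove exactly this sequence under real load, not on paper.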
That is the kind of unglamorous operational discipline that tends to separate serious facilities from merely well-marketed ones.
Cooling: Efficiency, economization, and future flexibility
Cooling is another area where the site’s brochure claims and field reality match up well.
Sabey’s materials say the building supports hot aisle containment, new efficient chilled-water CRAH units, and water-side economization. The brochure also says new buildouts can support any rack density, including GPU farm and HPC environments, with liquid-cooling-ready and hybrid cooling options.
Bajrushi’s walkthrough showed the current cooling backbone behind those claims. He pointed to the site’s five York chillers, including a newer unit that had to be craned into the building because it was too large for the elevator. He also highlighted the plate-frame heat exchangers used for pre-chilling in cold weather. Bajrushi explained that when outside conditions are favorable, the chillers can be shut off entirely and the heat exchangers can handle the load rejection needed to maintain temperature.
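The economizer decision itself reduces to comparing the chilled-water setpoint with the coldest water the towers and heat exchangers can produce at the current outdoor wet-bulb temperature. Here is a simplified sketch with illustrative setpoints and approach temperatures, not the site's actual values:

```python
# A simplified sketch of the water-side economizer decision described above.
# When the outdoor wet bulb is low enough, cooling-tower water run through
# plate-frame heat exchangers can carry the whole load and the chillers can
# be shut off. All setpoints here are illustrative.

CHW_SUPPLY_F = 55.0        # hypothetical chilled-water supply setpoint
TOWER_APPROACH_F = 7.0     # tower water temp above ambient wet bulb
HX_APPROACH_F = 3.0        # plate-frame heat exchanger approach

def cooling_mode(outdoor_wet_bulb_f: float) -> str:
    # Coldest water the tower + heat exchanger can deliver right now:
    achievable_chw_f = outdoor_wet_bulb_f + TOWER_APPROACH_F + HX_APPROACH_F
    if achievable_chw_f <= CHW_SUPPLY_F:
        return "full economizer (chillers off)"
    elif achievable_chw_f <= CHW_SUPPLY_F + 8.0:
        return "partial economizer (pre-chill, chillers trimmed)"
    return "mechanical cooling (chillers on)"

for wb in (35.0, 48.0, 70.0):
    print(f"wet bulb {wb:.0f}F -> {cooling_mode(wb)}")
```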
He also described the major project Sabey undertook to tie the chillers together in a way that allows each one to be isolated individually without disrupting the loop. Bajrushi noted that this was part of making the system concurrently maintainable and said the work had been completed within roughly the last year and a half.
That is an important detail because it shows the building is not static. SDC Manhattan is being actively upgraded, not simply maintained.
The AI angle enters here too. Sabey’s December release says the building’s liquid-cooling-ready infrastructure can support hybrid cooling configurations for GPUs and custom accelerators while maintaining the company’s energy-efficiency standards. Bajrushi was more measured in discussing AI directly, but he confirmed that inference use cases have been discussed internally and that the idea is viewed as a real possibility for the site.
Inside the colo floors
On the colocation floors, the building looks more familiar to data center operators, though even there the vertical context never disappears.
Bajrushi described the room-level cooling approach as a flooded cold-air design with hot air rejected into the ceiling plenum. He noted that customers can choose either full hot aisle containment or chimney-based configurations, depending on workload and design preference. He also pointed out future project space taped off on the floor, indicating that additional capacity is already being prepared for new customer deployments.
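Whatever the containment scheme, the design question on a floor like this reduces to moving enough air per kilowatt. The standard sensible-heat relation, Q(BTU/hr) = 1.08 × CFM × ΔT(°F), gives a quick feel for why density drives airflow; the rack powers and delta-Ts below are illustrative, not specific to this facility:

```python
# Back-of-envelope airflow for a contained rack, using the standard
# sensible-heat relation Q(BTU/hr) = 1.08 x CFM x dT(F). Rack powers and
# delta-Ts are illustrative examples.

BTU_PER_KW = 3412.0
AIR_FACTOR = 1.08          # standard-air sensible heat factor

def required_cfm(rack_kw: float, delta_t_f: float) -> float:
    return rack_kw * BTU_PER_KW / (AIR_FACTOR * delta_t_f)

for kw, dt in ((10, 20), (30, 25), (50, 30)):
    print(f"{kw} kW rack at {dt}F delta-T -> {required_cfm(kw, dt):,.0f} CFM")
```

The numbers also hint at why the brochure's liquid-cooling-ready language matters: past a certain density, moving that much air stops being practical.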
The brochure emphasizes that SDC Manhattan is suitable for retail customers, private suites, and powered shell users alike. Bajrushi’s tour reinforced that flexibility. He discussed how one customer had started in a relatively small footprint and later expanded significantly, using that progression as an example of how long-term customers can grow within the building rather than having to leave it.
That customer growth story matters because it suggests the building’s appeal is not just about initial deployment. It is about continuity.
Connectivity remains the real differentiator
For all the talk of power and cooling, the building’s deepest structural advantage may still be connectivity.
Sabey’s brochure lists 23 carriers and emphasizes diverse pathways through north and south risers, dual campus fiber entries, meet-me-room access, and a rooftop antenna array with clear sightlines and available space. The facility currently supports connections to more than 15 Tier 1 network providers, according to the brochure.
Bajrushi put that into operational terms. He showed how fiber comes up through the risers into cable vault infrastructure, splice shelves, carrier equipment rooms, and meet-me rooms. He noted that all customer cross-connects ultimately flow through that ecosystem, and said that he personally manages the cross-connect work for the building’s customers.
He also gave a clearer sense of the rooftop wireless business, saying the antenna deck now supports roughly 150 devices across some 16 customers or carriers. Bajrushi added that some positions on the roof are much more in demand than others because point-to-point wireless requires clear shots to specific endpoints.
Once on the roof, that logic became obvious. The building has unusually clean sightlines across Brooklyn, Staten Island, New Jersey, and large sections of Manhattan. Bajrushi noted that nothing blocks transmission for long distances in multiple directions, which helps explain why the rooftop platform has become such a valuable asset.
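The demand for particular roof positions follows directly from radio physics: a point-to-point link needs clearance not just along the straight line between antennas but across the first Fresnel zone around it, whose midpoint radius is roughly 8.66 × √(D_km / f_GHz) meters. A quick sketch, with illustrative link distances and frequencies:

```python
# Why "clear shots" matter for rooftop point-to-point links: the path needs
# clearance across the first Fresnel zone, not just the sightline.
# Midpoint radius: r = 8.66 * sqrt(D_km / f_GHz) meters.
# Link names, distances, and frequencies below are illustrative.

import math

def fresnel_radius_m(distance_km: float, freq_ghz: float) -> float:
    """Maximum (midpoint) radius of the first Fresnel zone."""
    return 8.66 * math.sqrt(distance_km / freq_ghz)

links = [
    ("short microwave hop", 2.0, 18.0),
    ("cross-river link", 8.0, 11.0),
    ("millimeter-wave hop", 1.0, 80.0),
]
for name, d, f in links:
    r = fresnel_radius_m(d, f)
    print(f"{name}: {d} km at {f} GHz -> keep ~{r:.1f} m clear at midpoint")
```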
This is where SDC Manhattan feels especially timely. As AI inference infrastructure pushes closer to users and applications, dense urban nodes with both fiber and wireless optionality begin to look more strategic, not less.
Security, resilience, and the advantage of elevation
Sabey’s marketing material makes much of the site’s security posture, citing police barricades, controlled street and loading dock access, 24/7 personnel, mantraps, CCTV, secure access checkpoints, and ballistic-rated access points.
Bajrushi’s tour supported that characterization in practical ways. He pointed out security controls at elevator banks, mantraps on data center entrances, tightly controlled movement between the core and data hall areas, and constant monitoring throughout the site.
But just as important as formal security is the building’s physical resilience.
Bajrushi repeatedly returned to elevation as one of the building’s underappreciated strengths. While discussing the infrastructure floors, he noted that major electrical distribution is already dozens of feet above grade and that by the fourth floor the tour was nearing 100 feet above street level. Referring to Superstorm Sandy, Bajrushi said floodwaters never reached the building because of both its height and its position relative to the surrounding terrain.
That is the sort of urban resiliency advantage that is easy to overlook until it matters.
A human-intensive operation
One other thing stood out throughout the tour: this is not a lights-out facility.
Bajrushi described SDC Manhattan as a heavily staffed, unionized operating environment with engineers, security personnel, cleaners, and facilities support all working on site. He estimated that roughly 50 people are involved in the day-to-day operation of the site and noted that Sabey partners with CBRE on management of the office block and engineering support.
He also described a network operations environment in which on-site teams are backed by oversight from staff at other Sabey locations, creating multiple layers of monitoring and control. Bajrushi added that nothing slips through the cracks because so many people are actively watching the systems.
For a Manhattan data center serving long-tenured customers with mission critical needs, that labor intensity feels less like inefficiency than part of the product.
Where the AI inference thesis fits
The most interesting thing about Sabey’s December 2025 press release is not that it mentions AI. Every data center company now mentions AI.
What is more interesting is that the company is specifically talking about inference, not training.
That distinction matters.
Inference infrastructure is about:
- proximity to users
- low latency
- fast response times
- dense network access
- scalable but not necessarily hyperscale deployment footprints
In other words, it is about the exact kind of urban interconnection environment that SDC Manhattan already embodies.
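The underlying argument is speed-of-light arithmetic. Light in fiber covers roughly 200 km per millisecond, about 5 µs per km one way, and real routes run longer than straight lines. A back-of-envelope sketch, with illustrative distances and a hypothetical route factor:

```python
# The inference-at-the-edge argument is, at bottom, speed-of-light
# arithmetic. Light in fiber travels at roughly 2/3 c, about 5 us per km
# one way. Distances and the route factor below are illustrative.

FIBER_US_PER_KM = 5.0     # ~c / 1.47 fiber refractive index, one way
ROUTE_FACTOR = 1.5        # fiber paths are rarely straight lines

def round_trip_ms(straight_line_km: float) -> float:
    return 2 * straight_line_km * ROUTE_FACTOR * FIBER_US_PER_KM / 1000.0

for label, km in (("same-city edge site", 5), ("suburban campus", 100),
                  ("regional hub", 400)):
    print(f"{label} (~{km} km): ~{round_trip_ms(km):.2f} ms fiber RTT")
```

Milliseconds of fiber distance are exactly the budget that latency-sensitive trading and real-time inference workloads cannot afford to spend.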
Tim Mirick, Sabey’s president, said in the press release that “the future of AI isn’t just about training—it’s about delivering intelligence at scale,” adding that the Manhattan facility places that capability “at the edge of one of the world’s largest and most connected markets.”
That message lines up with what Bajrushi described during the tour. He noted that the site can function as an edge location, that inference use cases have already come up in conversation, and that the building’s location and network density create a unique opportunity set.
At the same time, Bajrushi was realistic about the physical limitations of a stacked urban site. He noted that water placement and floor-by-floor constraints matter in ways that differ from purpose-built suburban AI campuses. That realism is helpful. It keeps the AI narrative grounded.
SDC Manhattan is not trying to become a giant greenfield GPU training campus. It is something else: an interconnection-rich, power-constrained but still expandable urban node that increasingly looks well suited to inference.
The view from the roof, and the larger takeaway
The roof at 375 Pearl is one of those places that snaps a facility’s logic into focus.
From up there, you can see the cooling towers, generator exhaust paths, antenna infrastructure, and unobstructed views across the surrounding boroughs. You can also see why the building has become such a useful perch for communications networks and why Sabey believes it can play a larger role in next-generation digital infrastructure.
The takeaway from SDC Manhattan is not that it is the biggest or most AI-dense facility in the market. It is not.
The takeaway is that Sabey has assembled something more nuanced and, in Manhattan, arguably more valuable:
- a carrier-rich interconnection point
- a resilient vertical infrastructure stack
- a site with remaining power and deployment flexibility
- a realistic path toward urban AI inference workloads
In a market where new land is scarce, utility timelines are difficult, and latency still matters, 375 Pearl shows what happens when a data center operator treats an old telecom landmark not as a relic, but as a platform.
And on that front, SDC Manhattan increasingly looks less like an adaptive reuse story from yesterday and more like an edge-and-inference story for what comes next.