Hi,
I'd like to know whether something like this would be a proper use of DFS.
We have about 400 to 500 GB of data that we need copied locally to 100 workstations.
We were considering using DFS to create a namespace from the server to the workstations.
The idea is that only a portion of the 400 GB is updated each day, and since DFS is "smart," the transfers after the initial sync should be fast and efficient. Is this thinking correct?
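To be concrete about what I mean by "smart": I'm hoping that after the initial sync, DFS effectively behaves like the simplified sketch below, i.e. it only moves files that changed since the last pass. This is just an illustration of the incremental-copy idea in Python with made-up paths, not how DFS Replication actually works internally (as I understand it, DFS-R uses Remote Differential Compression and can even transfer sub-file deltas).

    # Conceptual sketch only: skip files that look unchanged, copy the rest.
    # Paths are hypothetical; this is not DFS-R's real mechanism.
    import os
    import shutil

    def sync_changed_files(source_root: str, dest_root: str) -> int:
        """Copy only files whose size or modification time differs from the destination."""
        copied = 0
        for dirpath, _dirnames, filenames in os.walk(source_root):
            rel = os.path.relpath(dirpath, source_root)
            dest_dir = os.path.join(dest_root, rel)
            os.makedirs(dest_dir, exist_ok=True)
            for name in filenames:
                src = os.path.join(dirpath, name)
                dst = os.path.join(dest_dir, name)
                src_stat = os.stat(src)
                # Skip files that appear unchanged (same size and mtime).
                if os.path.exists(dst):
                    dst_stat = os.stat(dst)
                    if (src_stat.st_size == dst_stat.st_size
                            and int(src_stat.st_mtime) == int(dst_stat.st_mtime)):
                        continue
                shutil.copy2(src, dst)  # copy2 preserves the modification time
                copied += 1
        return copied

    if __name__ == "__main__":
        # Hypothetical source share and local destination, for illustration only.
        changed = sync_changed_files(r"\\server\share\data", r"C:\localdata")
        print(f"Copied {changed} changed files")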
In addition, is DFS smart enough to take advantage of multicasting to limit the amount of data moved across the network?
Thanks,
George